
Eight Ways to Block and Redirect with Apache’s mod_rewrite

With the imminent release of the next series of (4G) blacklist articles here at Perishable Press, now is the perfect time to examine eight of the most commonly employed blacklisting methods achieved with Apache’s incredible rewrite module, mod_rewrite. In addition to facilitating site security, the techniques presented in this article will improve your understanding of the different rewrite methods available with Apache mod_rewrite.

Note: I changed the title of this post from “Eight Ways to Blacklist..” to “Eight Ways to Redirect..”. The post uses the art of blacklisting to show how to redirect with Apache’s mod_rewrite. The examples may be modified to redirect (or block) just about anything. Check out the section Handling matched requests for details.

Blacklist via Request Method

[ #1 ] This first blacklisting method evaluates the client’s request method. Every time a client attempts to connect to your server, it sends a message indicating the type of connection it wishes to make. There are many different types of request methods recognized by Apache. The two most common methods are GET and POST requests, which are required for “getting” and “posting” data to and from the server. In most cases, these are the only request methods required to operate a dynamic website. Allowing more request methods than are necessary increases your site’s vulnerability. Thus, to restrict the types of request methods available to clients, we use this block of Apache directives:

<IfModule mod_rewrite.c>
 RewriteEngine On
 ServerSignature Off
 Options +FollowSymLinks
 RewriteCond %{REQUEST_METHOD} ^(delete|head|trace|track) [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

The key to this rewrite method is the REQUEST_METHOD variable in the rewrite condition. First we invoke a few precautionary security measures, and then we evaluate the request method against our list of prohibited types. Apache compares each client request method against the blacklisted expressions and denies access to any forbidden requests. Here we are blocking DELETE and HEAD because they are unnecessary for most sites, and also blocking TRACE and TRACK because they are associated with cross-site tracing (XST) attacks. Of course, I encourage you to do your own research and establish your own request-method security policy.
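An alternative, and often tighter, approach is to invert the logic into a whitelist: rather than enumerating bad methods, permit only the ones your site actually needs. Here is a minimal sketch; the allowed set of GET, POST, and HEAD is an assumption, so adjust it for your own site:

```apache
# Whitelist sketch: deny every request method except GET, POST, and HEAD
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{REQUEST_METHOD} !^(GET|POST|HEAD)$
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```

The leading `!` negates the match, so anything outside the allowed set triggers the forbidden response.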

Blacklist via the Request

[ #2 ] The next blacklisting method is based on the client’s request. When a client attempts to connect to the server, it sends a full HTTP request string that specifies the request method, request URI, and transfer-protocol version. Note that additional headers sent by the browser are not included in the request string. Here is a typical example:

GET /blog/index.html HTTP/1.1

This long request string may be checked against a list of prohibited characters to protect against malicious requests and other exploitative behavior. Here is an example of sanitizing client requests by way of Apache’s THE_REQUEST variable:

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{THE_REQUEST} ^.*(\\r|\\n|%0A|%0D).* [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

Here we are evaluating the entire client-request string against a list of prohibited entities. While there are many character strings common to malicious requests, this example focuses on the prevention of HTTP response splitting, cross-site scripting attacks, cache poisoning, and similar dual-header exploits. Although these are some of the most common types of attacks, there are many others. I encourage you to check your server logs, do some research, and sanitize accordingly.
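To illustrate how THE_REQUEST patterns can be extended, here is a hedged variant that also catches raw or encoded angle brackets in the request line, a common ingredient of XSS probes. The added pattern is illustrative, not exhaustive:

```apache
# Extended sketch: block newline injection plus raw or encoded angle brackets
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{THE_REQUEST} (\\r|\\n|%0A|%0D) [NC,OR]
 RewriteCond %{THE_REQUEST} (<|>|%3C|%3E) [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```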

Blacklist via the Referrer

[ #3 ] Blacklisting via the HTTP referrer is an excellent way to block referrer spam, defend against penetration tests, and protect against other malicious activity. The HTTP referrer is identified as the source of an incoming link to a web page. For example, if a visitor arrives at your site through a link they found via Google, the referrer would be the Google page from whence the visitor came. Sounds straightforward, and it is.

Unfortunately, one of the biggest spam problems on the Web involves the abuse of HTTP referrer data. To improve search-engine rank, spambots repeatedly visit your site using their spam domain as the referrer. The referrer is generally faked, and the bots frequently use HEAD requests for the sake of efficiency. If the target site publicizes its access logs, the spam sites receive a search-engine boost from the links in the referrer statistics.

Fortunately, by taking advantage of mod_rewrite’s HTTP_REFERER variable, we can forge a powerful, customized referrer blacklist. Here’s our example:

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_REFERER} ^(.*)(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]
 RewriteCond %{HTTP_REFERER} ^http://(www\.)?.*(-|.)?adult(-|.).*$  [NC,OR]
 RewriteCond %{HTTP_REFERER} ^http://(www\.)?.*(-|.)?poker(-|.).*$  [NC,OR]
 RewriteCond %{HTTP_REFERER} ^http://(www\.)?.*(-|.)?drugs(-|.).*$  [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

Same basic pattern as before: check for the availability of the rewrite module, enable the rewrite engine, and then specify the prohibited character strings using the HTTP_REFERER variable and as many rewrite conditions as necessary. In this case, we are blocking a series of potentially malicious characters in the first condition, and then blocking any referrer containing the terms “adult”, “poker”, or “drugs”. Of course, we may blacklist as many referrer strings as needed by simply emulating the existing rewrite conditions. Just don’t get carried away — I have seen some referrer blacklists that are over 4000 lines long!
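For reference, the same pattern extends naturally to blocking specific referring domains rather than generic terms. The domain names below are placeholders; substitute the actual offenders from your access logs:

```apache
# Hypothetical example: block referrals from specific spam domains
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*spam-domain\.tld [NC,OR]
 RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*bad-neighbor\.tld [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```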

Blacklist via Cookies

[ #4 ] Protecting your site against malicious cookie exploits is greatly facilitated by using Apache’s HTTP_COOKIE variable. HTTP cookies are chunks of data sent by the server to the web client upon initialization. The browser then sends the cookie information back to the server for each subsequent visit. This enables the server to authenticate users, track sessions, and store preferences. A common example of the type of functionality enabled by cookies is the shopping cart. Information about the items placed in a user’s shopping cart may be stored in a cookie, thereby enabling server scripts to respond accordingly.

Generally, a cookie consists of a unique string of alphanumeric text and persists for the duration of a user’s session. On Apache servers, the mod_usertrack module can generate such cookie values automatically. Once a cookie has been set, it may be used as a database key for further processing, behavior logging, session tracking, and much more. Unfortunately, this useful technology may be abused by attackers to penetrate and infiltrate your server’s defenses. Cookie-based protocols are vulnerable to a variety of exploits, including cookie poisoning, cross-site scripting, and cross-site cooking. By adding malicious characters, scripts, and other content to cookies, attackers may find and exploit sensitive vulnerabilities.

The good news is that we may defend against most of this nonsense by using Apache’s HTTP_COOKIE variable to blacklist characters known to be associated with malicious cookie exploits. Here is an example that does the job:

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_COOKIE} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

This is as straightforward as it looks. Check for the required rewrite module, enable the rewrite engine, and deny requests for any HTTP_COOKIEs containing the specified list of prohibited characters. In this list you will see characters generally required to execute any sort of scripted attack: opening and closing angle brackets, single quotation marks, and a variety of hexadecimal equivalents. Feel free to expand this list with additional characters as you see fit. As always, recommendations are most welcome.
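Abnormally long cookies can also indicate abuse, such as attempts to exhaust server-side session memory. As a hedged sketch, the following denies any request whose Cookie header contains a single unbroken run of 512 or more characters; the threshold is an assumption and should be tuned to your application:

```apache
# Sketch: deny requests containing an unbroken 512+ character cookie value
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_COOKIE} [a-zA-Z0-9_%=+-]{512}
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```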

Blacklist via Request URI

[ #5 ] Use of Apache’s REQUEST_URI variable is frequently seen in conjunction with URL canonicalization. The REQUEST_URI variable targets the requested resource specified in the full HTTP request string. Thus, we may use Apache’s THE_REQUEST variable to target the entire request string (as discussed above), while using the REQUEST_URI variable to target the actual request URI. For example, the REQUEST_URI variable refers to the “/blog/index.html” portion of the following, full HTTP request line:

GET /blog/index.html HTTP/1.1

For canonicalization purposes, this is exactly the type of information that must be focused on and manipulated in order to achieve precise, uniform URLs. Likewise, the REQUEST_URI variable makes it easy to target, evaluate, and deny malicious URL requests, such as the kind of nonsense usually exposed in your server’s access and error logs.

As you can imagine, blacklisting via REQUEST_URI is an excellent way to eliminate all sorts of malicious behavior. Here is an example that includes some of the same characters and strings that are blocked in the 4G Blacklist:

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{REQUEST_URI} ^.*(,|;|:|<|>|">|"<|\.\.\.).*     [NC,OR]
 RewriteCond %{REQUEST_URI} ^.*(\=|\@|\[|\]|\^|\`|\{|\}|\~).* [NC,OR]
 RewriteCond %{REQUEST_URI} ^.*(\'|%0A|%0D|%27|%3C|%3E|%00).* [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

Again, the same general pattern of directives as before, only this time we are specifying forbidden characters via the REQUEST_URI variable. Here we are denying any URL request containing invalid characters, including different types of brackets, various punctuation characters, and some key hexadecimal equivalents. Of course, the possibilities are endless, and the blacklist should be customized according to your specific security strategy and unfolding blacklisting needs.
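Directory-traversal probes are another frequent sight in the logs, and REQUEST_URI handles them just as easily. A small illustrative addition:

```apache
# Illustrative: deny traversal sequences and null bytes in the request URI
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{REQUEST_URI} (\.\./|%2e%2e|%00) [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```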

Blacklist via the User Agent

[ #6 ] Blacklisting via user-agent is a commonly seen strategy that yields questionable results. The concept revolves around the idea that every browser, bot, and spider that visits your server identifies itself with a specific user-agent string. Thus, user-agents associated with malicious, unfriendly, or otherwise unwanted behavior may be identified and blacklisted in order to prevent future access. This well-known strategy has resulted in some extensive and effective user-agent blacklists.

Of course, the downside to this method is that user-agent information is easily forged, making it difficult to know the true identity of blacklisted clients. By simply changing its user-agent string to an unknown identity, a malicious bot can bypass every blacklist on the Internet. Many evil “scumbots” do exactly that, which explains the incredibly vast number of blacklisted user-agents. Even so, plenty of lazy or poorly written clients never bother to disguise themselves, so default strings such as those sent by GNU’s Wget and the cURL command-line tool still show up in the logs, and many other clients ship with hard-coded user-agent strings that their operators never change.

On Apache servers, user-agents are easily identified and blacklisted via the HTTP_USER_AGENT variable. Here is an example:

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_USER_AGENT} ^$                                                              [OR]
 RewriteCond %{HTTP_USER_AGENT} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).*                            [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^.*(HTTrack|clshttp|archiver|loader|email|nikto|miner|python).* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^.*(winhttp|libwww\-perl|curl|wget|harvest|scan|grab|extract).* [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

This method works just like the others: check for the mod_rewrite module, enable the rewrite engine, and proceed to deny access to any user-agent that includes any of the blacklisted character strings in its name. As with our previous blacklisting techniques, here we are prohibiting angle brackets, single quotation marks, and various hexadecimal equivalents.

Additionally, we include a handful of user-agent strings commonly associated with server attacks and other malicious behavior. We certainly don’t need anything associated with libwww-perl hitting our server, and many of the others are included in just about every user-agent blacklist that you can find. There are tons of other nasty user-agent scumbots out there, so feel free to beef things up with a few of your own.
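One caveat: the first condition (^$) in the block above also denies any client that sends an empty User-Agent header, which may include legitimate in-house tools and monitoring scripts. If that is a problem, drop the ^$ condition and match only the named offenders, as in this trimmed sketch:

```apache
# Trimmed sketch: blacklist named agents while allowing empty User-Agent headers
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_USER_AGENT} (HTTrack|nikto|libwww\-perl|harvest|extract) [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```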

Blacklist via the Query String

[ #7 ] Protecting your server against malicious query-string activity is extremely important. Whereas static URLs summon pages, their appended query strings transmit data and pass variables throughout the domain. Query-string information interacts with scripts and databases, influencing behavior and determining results. This relatively open channel of communication is easily accessible and prone to external manipulation. By altering data and inserting malicious code, attackers may penetrate and exploit your server directly through the query string.

Fortunately, we can protect our server against malicious query-string exploits with the help of Apache’s invaluable QUERY_STRING variable. By taking advantage of this variable, we can ensure the legitimacy and quality of query-string input by screening out and denying access to a known collection of potentially harmful character strings. Here is an example that will keep our query strings squeaky clean:

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} ^.*(localhost|loopback|127\.0\.0\.1).*                                [NC,OR]
 RewriteCond %{QUERY_STRING} ^.*(\.|\*|;|<|>|'|"|\)|%0A|%0D|%22|%27|%3C|%3E|%00).*                 [NC,OR]
 RewriteCond %{QUERY_STRING} ^.*(md5|benchmark|union|select|insert|cast|set|declare|drop|update).* [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

As you can see, here we are using the QUERY_STRING variable to check all query-string input against a list of prohibited alphanumeric character strings. This strategy will deny access to any URL request that includes a query string containing localhost references, invalid punctuation, hexadecimal equivalents, and various SQL commands. Blacklisting these entities protects us from common cross-site scripting (XSS), remote shell attacks, and SQL injection. And, while this is a good start, it pales in comparison to the new query-string directives of the upcoming 4G Blacklist. ;)
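As a taste of what such expansion might look like, here is a hedged extension that also screens for strings common to PHP-injection attempts; the patterns are illustrative samples, not a complete list:

```apache
# Illustrative extension: deny query strings carrying PHP-injection payloads
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} (base64_encode|base64_decode|eval\(|GLOBALS|_REQUEST) [NC]
 RewriteRule ^(.*)$ - [F,L]
</IfModule>
```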

Update: For more information, check out Redirect Query String via .htaccess.

Blacklist via IP Address

[ #8 ] Last but certainly not least, we can blacklist according to IP address. Blacklisting sites based on IP is probably the oldest method in the book and works great for denying site access to stalkers, scrapers, spammers, trolls, and many other types of troublesome morons. The catch is that the method only works when the perpetrators are coming from the same location. An easy way to bypass any IP blacklist is to simply use a different ISP or visit via proxy server. Even so, there is no lack of mindless creeps out there roaming the Internet, who sit there, using the same machine, day after day, relentlessly harassing innocent websites. For these types of lazy, no-life losers, blacklisting via IP address is the perfect solution. Here is a hypothetical example demonstrating several ways to blacklist IPs:

# block individual IPs
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{REMOTE_ADDR} ^123\.456\.789\.1 [OR]
 RewriteCond %{REMOTE_ADDR} ^456\.789\.123\.2 [OR]
 RewriteCond %{REMOTE_ADDR} ^789\.123\.456\.3
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

# block ranges of IPs
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{REMOTE_ADDR} ^123\. [OR]
 RewriteCond %{REMOTE_ADDR} ^456\.789\. [OR]
 RewriteCond %{REMOTE_ADDR} ^789\.123\.456\.
 RewriteRule ^(.*)$ - [F,L]
</IfModule>

# alt block IP method
<Limit GET POST PUT>
 Order Allow,Deny
 Allow from all
 Deny from 123.
 Deny from 123.456.
 Deny from 123.456.789.0
</Limit>

In the first block, we are blacklisting three specific IP addresses using Apache’s mod_rewrite and its associated REMOTE_ADDR variable. Each of the hypothetical IPs listed represents a specific, individual address. Then, in the next code block, we are blocking three different ranges of IPs by omitting numerical data from the targeted IP string. In the first line we target any IP beginning with “123.”, which is an enormous number of addresses. In the second line, we block a different, more restrictive range by including the second portion of the address. Finally, in the third line, we block a different, much smaller range of IPs by including a third portion of the address.

Then, just for kicks, I threw in an alternate method of blocking IPs. This is an equally effective method that enables you to block IP addresses and ranges as specifically as necessary. Each deny line pattern-matches according to the specified IP string.
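Worth noting: on Apache 2.4 and later, mod_authz_core replaces the old Order/Allow/Deny directives and supports proper CIDR notation. An equivalent sketch using the newer syntax (the addresses shown are documentation placeholders):

```apache
# Apache 2.4+ equivalent using mod_authz_core with CIDR ranges
<RequireAll>
 Require all granted
 Require not ip 192.0.2.1
 Require not ip 198.51.100.0/24
</RequireAll>
```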

Handling matched requests

In each of these eight blacklisting techniques, we respond to blacklisted visitors with the server’s default “403 Forbidden” error (via the [F] flag). The default 403 page serves its purpose and costs very little in system resources to deliver; however, there is much more that you can do with matched HTTP requests. Here are a few examples:

Redirect to home page

More subtle than the 403 error, this redirect strategy routes blocked traffic directly to the home page. To use, replace the RewriteRule directive (i.e., the entire line) with the following code:

RewriteRule ^(.*)$ http://your-domain.tld/ [R=302,L]

Notice that here we are specifying the status code via the [R=302] flag. Feel free to use any valid redirect status code (e.g., 301 for a permanent redirect). For more information, check out my tutorial, Redirect Query String via .htaccess.

Redirect to external site

The possibilities here are endless. Just make sure you think twice about the destination, as any scum that you redirect to another site will be seen as coming from your own. Even so, here is the code that you would use to replace the RewriteRule directive in any of the examples above:

RewriteRule ^(.*)$ http://external-domain.tld/some-target/page.html [R=302,L]

Redirect them back to their own site

This is one of my favorites. It’s like having a magic shield that reflects attacks back at the attacker. Send a clear message by using this code as the RewriteRule directive in any of our blacklisting methods:

RewriteRule ^(.*)$ http://%{REMOTE_ADDR}/ [R=302,L]

Custom processing

For those of you with a little skill, it is possible to redirect your unwelcome guests to a fail-safe page that explains the situation to the client while logging all of the information behind the scenes. This is perhaps the most useful approach for understanding your traffic and developing an optimal security strategy. Note that the target of an external [R] redirect must be a URL path rather than a filesystem path, so the code would look something like this, depending on your script’s name and location:

RewriteRule ^(.*)$ /blacklisting-script.php [R=302,L]
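Alternatively, you can log matched requests without a separate script by flagging them with an environment variable and pointing a dedicated log at that flag. A sketch, where the condition and log path are illustrative; note that CustomLog belongs in the main server config, not .htaccess:

```apache
# Sketch: flag blacklisted hits and log them to their own file
RewriteCond %{HTTP_USER_AGENT} (HTTrack|nikto) [NC]
RewriteRule ^(.*)$ - [F,E=blacklisted:1]
CustomLog /var/log/apache2/blacklist.log combined env=blacklisted
```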

Closure

This article presents eight effective techniques for protecting your server and preventing malicious behavior. While each of these methods may be used individually, they are designed to secure different aspects of your environment, and thus provide more complete, firewall-style protection when combined. Even when combining these techniques, however, keep in mind that blacklisting serves to complement a more robust and comprehensive security strategy. Once understood, these methods provide the average webmaster with an easy, effective way of defending against unwanted behavior and enhancing the overall security of their site.

About the Author
Jeff Starr = Fullstack Developer. Book Author. Teacher. Human Being.
Digging Into WordPress: Take your WordPress skills to the next level.

61 responses to “Eight Ways to Block and Redirect with Apache’s mod_rewrite”

  1. Jeff Starr 2010/11/15 6:18 pm

    Hi Jennifer,

    Yes very true, there are many ways to fashion your Apache directives, but this article treats the eight techniques as independently functioning examples that people can grab and modify without having to read the entire article.

    A good example of combining these different techniques is seen in these htaccess blacklists:

    Thanks for the great questions! :)

  2. I have a website that is highly flooded last days.
    The flood comes through scripts that generate fake URL requests to my site. I want to block them but I don’t know how. Can you please help me block these types of requests?

    109.193.54.12 - - [23/Jan/2011:20:16:32 -0500] "GET /index.php?app=forums&module=ajax&section=markasread&secure_key=4qgvsqett1glepq2o5p1exqazhndd6hl&i=1&forumid=471&secure_key=4qgv sqett1glepq2o5p1exqazhndd6hl HTTP/1.0" 403 432 "-" "Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2.13) Gecko/20101203 YFF35 Firefox/3.6.13"

    OR

    95.174.209.84 - - [23/Jan/2011:20:31:54 -0500] "GET /index.php?app=forums&module=ajax&section=markasread&secure_key=tkqnx2x8ceo1ipglifpmo7qepd7s1ogo&i=1&forumid=361&secure_key=tkqnx2x8ceo1ipglifpmo7qepd7s1ogo HTTP/1.0" 403 432 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; Maxthon; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET4.0C; .NET4.0E; InfoPath.2; Creative AutoUpdate v1.10.10)"

    OR

    79.170.163.222 - - [23/Jan/2011:20:30:13 -0500] "GET /public/style_images/test/topt-bg.gif HTTP/1.0" 200 9746 "http://forum.soundarea.org/index.php?/forum/236-gevkpsyrln/page__prune_day__100__sort_by__A-Z__sort_key__starter_name__topicfilter__all__st__1380" "Opera/9.80 (Windows NT 5.1; U; ru) Presto/2.7.62 Version/11.00"

    OR

    92.72.151.58 - - [23/Jan/2011:20:30:46 -0500] "GET /index.php?/forum/98-tuaeiwmbzh/page__prune_day__100__sort_by__A-Z__sort_key__starter_name__topicfilter__all__st__1540 HTTP/1.0" 200 26817 "-" "Opera/9.80 (Windows NT 6.1; U; ru) Presto/2.7.62 Version/11.00"

    OR

    2.37.132.232 - - [23/Jan/2011:20:30:45 -0500] "GET /index.php?/forum/638-vlyxkruhxe/page__prune_day__100__sort_by__Z-A__sort_key__last_poster_name__topicfilter__all__st__1420 HTTP/1.0" 200 23762 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; BTRS28059; GTB6.6; MRA 5.7 (build 3757); MRSPUTNIK 2, 3, 0, 288; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30

    Thank you in advance!

  3. Jeff, I am looking at your code for blocking by User-Agent. Here is the rule that you have written for putting into a VHost file:

    <IfModule mod_rewrite.c>
     RewriteEngine On
     RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
     RewriteCond %{HTTP_USER_AGENT} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]
     RewriteCond %{HTTP_USER_AGENT} ^.*(HTTrack|clshttp|archiver|loader|email|nikto|miner|python).* [NC,OR]
     RewriteCond %{HTTP_USER_AGENT} ^.*(winhttp|libwww\-perl|curl|wget|harvest|scan|grab|extract).* [NC]
     RewriteRule ^(.*)$ - [F,L]
    </IfModule>

    What I want to know, is what does this line mean:

    RewriteCond %{HTTP_USER_AGENT} ^$ [OR]

    It appears to block any request where the user agent is not specified, or is specified only as “-“. Can you elaborate on what exactly this line means. I just read the mod_rewrite documentation and that is the only thing I can’t piece together.

    The issue is that I want to block bots, but we have a piece of software that we are writing to send requests to our server and it seems to send its user-agent header in as “-“. I have corrected the issue with one of our programmers and now the newest version of the software sends out its name as the user-agent header, but the old versions of the software (which are out in the field) still don’t have a proper user-agent string, and I can’t block them.

  4. Ok after some further investigation, it seems that I have confirmed my suspicion, and that this line will block empty User Agent headers.

    RewriteCond %{HTTP_USER_AGENT} ^$ [OR]

    From this site: http://johannburkard.de/blog/www/spam/block-empty-user-agent.html

    I think it would be good if you put something in your tutorial here indicating what that line does, as some people may try to use it and be unaware that they are blocking empty User Agent headers. I realize the merits behind doing it, and that in a lot of cases it is probably a smart idea to do it if you are blacklisting by User Agent, however, I think it’s important for people to know that they are doing that, and based on reading your description of what I was doing with those lines of code, I didn’t realize that I was blocking empty User Agents.

  5. I was looking in the ‘block ranges of IP addresses’ to be able to redirect all Chinese visitors to my site to a different page (they can’t access Vimeo or YouTube and I want to offer them the download anyway, but only them; the others can access the normal video). But I can’t seem to get it happening. When I paste the code into the .htaccess file and upload, I get a 403 access denied, whatever IP-address I include. Any suggestions? Sorry I have to bother you (I’m no pro in this field :-), but this does seem like the best site around. Cheers, Ali

  6. What is it specifically that you are adding to htaccess that is causing the error? (Please wrap each line of code with <code> tags before posting comment). If you are using any of the code from the article, keep in mind that they are meant as examples only – using them may require further customization.

  7. Very nice article!
    If you don’t mind a .htaccess noob asking a question related to block some spam freaks on my site…

    I do have a lot of spam coming from two different hosts:

    unassigned.psychz.net

    and

    173.234.159.194.rdns.ubiquityservers.com

    last one with a lot of different ip adresses so my htaccess looks like e.g.:

    Deny from ubiquityservers.com
    Deny from rdns.ubiquityservers.com
    Deny from 69.147.227.178
    Deny from 69.147.227.194

    I don’t know if that is the correct syntax for stopping those two referrers from accessing my site; at least I guess it isn’t, since they still come back without any problem. So maybe a little help or suggestion would be highly appreciated…

    Kindest regards,
    s!

  8. hi Jeff

    Some nice use of the rewrite and rewritecond. I’ve spent a bit of time looking for a solution with no luck, hopefully you may have some experience that will help.

    I wish to rewrite the request method.

    for example if the request method is DELETE “/file/file-to-delete.txt”

    I would like to rewrite it with
    MOVE “/file/file-to-delete.txt” “/file/deleted/file-to-delete.txt”

    Any suggestions or links would be greatly appreciated.

    Regards
    Michael

  9. You are missing the awesome mod_rewrite DB way, so that you can build your DB dynamically.

    http://perlcode.org/tutorials/apache/attacks.html

  10. In my case, the IP range blocking doesn’t seem to work, if there is a $ at the end of the directive. For example, this will work:

    RewriteCond %{REMOTE_ADDR} ^207.46.

    But this won’t:

    RewriteCond %{REMOTE_ADDR} ^207.46.$

    • Jeff Starr 2012/04/28 1:34 pm

      Good call, Jot Es — I went ahead and removed the $ from the IP directives. Thanks for the feedback.

  11. Nice article… Just one question… How I can block a subdimain referer? eg..

    I would like to block requests from the follwoing subdomain @ blogspot.

    sattotal.blogspot.com

    I have used the following rewrite rule, but isn’t working…

    RewriteCond %{HTTP_REFERER} sattotal [NC]
    RewriteRule ^(.*)$ - [F,L]

    • The .htaccess code looks solid.. make sure you’re placing the code in the .htaccess file located in the root of your subdomain :)

  12. Hey Jeff, cool post with cooler style.

    Quick question about restricting cookies that are too long. In the case of a jsessionid DoS attack on a JVM’s memory, how would I reject pages (or drop the jsessionid) if the cookie is too long? (where request length isn’t a good measure here)

    Assume it’s the typical apache + mod_jk + tomcat.

    • Hmm, good question.. something like this should work:

      RewriteCond %{HTTP_COOKIE} ([a-zA-Z0-9]{255}) [NC]
      RewriteRule .* - [F,L]

      Then you would customize by setting 255 to whatever number of characters makes sense.
