Perishable Press

Latest Blacklist Entries

Recently cleared several megabytes of log files, detected some patterns, recorded the anomalies, and blacklisted the gross offenders. Gonna break it down into three sections:

User Agents

User-agents come and go, and are easily spoofed, but it’s worth a few lines of htaccess to block the more persistent bots that repeatedly scan your site with malicious requests.

# Nov 2010 User Agents
SetEnvIfNoCase User-Agent "MaMa " keep_out
SetEnvIfNoCase User-Agent "choppy" keep_out
SetEnvIfNoCase User-Agent "heritrix" keep_out
SetEnvIfNoCase User-Agent "Purebot" keep_out
SetEnvIfNoCase User-Agent "PostRank" keep_out
SetEnvIfNoCase User-Agent "archive.org_bot" keep_out
SetEnvIfNoCase User-Agent "msnbot.htm\)._" keep_out

<Limit GET POST PUT>
 Order Allow,Deny
 Allow from all
 Deny from env=keep_out
</Limit>

The first line blocks any user-agent containing “MaMa ”. If that scares you, then replace that line with these two:

SetEnvIfNoCase User-Agent "MaMa CyBer" keep_out
SetEnvIfNoCase User-Agent "MaMa Xirio" keep_out

The other lines block the latest batch of “loser-agents,” which may completely disappear overnight. My current strategy is to block for a few months and then start fresh. Stuff like heritrix, Purebot, and PostRank have made the list numerous times.
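One caveat on the <Limit> container: it applies the Allow/Deny rules only to the listed methods (Apache treats HEAD as a GET for this purpose, so that's covered), which means requests using other methods would sail right past the blacklist. If that worries you, the same rules work fine without the wrapper:

```apacheconf
# same blacklist, applied to every request method
Order Allow,Deny
Allow from all
Deny from env=keep_out
```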

Character Strings

There must be some exciting new vulnerability, because suddenly I’m seeing TONS of requests for the following resources in just about every virtual directory imaginable:

fpw.php
xmlpc.php
pingserver.php
test00.comze.com

What’s the best way to deal with endless requests for non-existent resources? I prefer to respond with 403 Forbidden and call it done:

# Nov 2010 Char Strings
<IfModule mod_alias.c>
 RedirectMatch 403 fpw.php
 RedirectMatch 403 xmlpc.php
 RedirectMatch 403 pingserver.php
 RedirectMatch 403 test00.comze.com
</IfModule>

Of course, make sure you aren’t actually using any of these files anywhere on your site before using this code.
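If mod_alias isn't available on your server, roughly the same blocking can be done with mod_rewrite. This is just a sketch (adapt to your own setup); note the escaped dots, which keep the patterns literal:

```apacheconf
# Nov 2010 Char Strings (mod_rewrite version)
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteRule (fpw|xmlpc|pingserver)\.php - [F,L]
 RewriteRule test00\.comze\.com - [F,L]
</IfModule>
```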

IP Addresses

Last but not least, here’s the latest batch of nefarious IP addresses. There’s no reason to block random botnet IPs, so only the most rogue static addresses make the list:

# Nov 2010 IPs
<Limit GET POST PUT>
 Order Allow,Deny
 Allow from all
 Deny from 65.55.3.211
 Deny from 72.229.57.27
 Deny from 77.93.2.81
 Deny from 77.221.130.18 
 Deny from 91.205.96.13
 Deny from 94.75.229.132
 Deny from 95.108.157.252
 Deny from 99.22.93.95
 Deny from 173.193.219.168
 Deny from 174.133.177.66
 Deny from 178.234.154.230
 Deny from 178.33.3.23
 Deny from 190.174.198.86
 Deny from 203.89.212.187
 Deny from 207.241.228.166
 Deny from 213.55.76.224
 Deny from 216.171.98.77
</Limit>

As with the user-agents, I like to block IPs for a month or so at a time. Implement (or not) as you see fit.
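Tip: Deny from also accepts partial addresses and CIDR ranges, which comes in handy when the bad requests cluster in a single netblock. The range below is purely an illustration, not something from my logs:

```apacheconf
# block an entire netblock (example range only)
Deny from 91.205.96.0/24
```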

Bonus IPs! – Looking for more bad IPs to block? Check out Vladimir’s post in the comments.

One last tip..

Don’t take my word for it. Check your own logs and see what shouldn’t be there. “Know thy enemy,” as they say ;)

For more help on blocking stuff with .htaccess, check out Eight Ways to Redirect with Apache’s mod_rewrite.

Jeff Starr
About the Author Jeff Starr = Fullstack Developer. Book Author. Teacher. Human Being.
24 responses
  1. Hello Jeff, using your code for blocking user agents in my htaccess causes the error below on my site; if I remove it, everything works fine. Any idea what could be wrong here?

    Best
    Soren

    Internal Server Error

    The server encountered an internal error or misconfiguration and was unable to complete your request.

    Please contact the server administrator, webmaster@digitalworkflow.se and inform them of the time the error occurred, and anything you might have done that may have caused the error.

    More information about this error may be available in the server error log.

    Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request.

    Apache/2 Server at www.digitalworkflow.se Port 80

  2. hello, i am encountering the same error as soren, it's because of the msnbot line.

  3. vortex,

    I removed the “msnbot” line, it works fine now

    Thanks

    Soren

  4. soren, instead of removing it, add a \ before the ), like this:

    SetEnvIfNoCase User-Agent "msnbot.htm\)._" keep_out

    cheers.

  5. FreeStuffer February 2, 2011 @ 11:18 pm

    I was having problems with a free for all links script I was running and the .htaccess resources here have helped me sort that out – a BIG BIG THANK YOU!

    One thing – does anyone know of a commented block list?

    Along the lines of……

    Deny from 65.55.3.211 # msnbot-65-55-3-211.search.msn.com
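Worth noting for anyone trying this: Apache doesn't allow a comment on the same line as a directive, so a per-IP annotation has to go on its own line, like so:

```apacheconf
# msnbot-65-55-3-211.search.msn.com
Deny from 65.55.3.211
```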

  6. I am bringing up a new WordPress site. I have already implemented your system. Thanks for your help.

    I also want to block access to my theme and plug-in folders to any request not coming from my domain. Will this damage access from legitimate search engine robots?

    Thanks in advance.

  7. not if you block access like in this example; it can be done in multiple ways:

    RewriteCond %{REQUEST_FILENAME} \.(jpe?g|gif|png|woff|otf|svg|eot)$ [NC]
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !(.*\.)?your-domain\.com [NC]
    RewriteCond %{HTTP_REFERER} !google\. [NC]
    RewriteCond %{HTTP_REFERER} !live\. [NC]
    RewriteCond %{HTTP_REFERER} !yahoo\. [NC]
    RewriteCond %{HTTP_REFERER} !msn\. [NC]
    RewriteCond %{HTTP_REFERER} !search\?q=cache [NC]
    RewriteRule .* - [F]

  8. I am new to this htaccess stuff so please excuse any dumb questions.

    Can I use this style?

    SetEnvIfNoCase User-Agent "google" allow_in
    SetEnvIfNoCase User-Agent "yahoo" keep_out
    ...
    SetEnvIfNoCase User-Agent "mydomain.com" allow_in
    SetEnvIfNoCase User-Agent "www.mydomain.com" allow_in

    order deny,allow
    deny from all
    allow from env=allow_in

    My thought was to put this .htaccess in the individual directories or is it better done in the root?

  9. In your example should I change the [NC] to [NC,OR] ?

  10. i think you should read the apache documentation. i am not a wordpress expert, so i don't know what you have in your plugin and theme folders that a search bot could use. i just gave you an example of how to block hotlinking [i understood from your comment that this is what you need].

  11. this should make your life a lot easier: http://httpd.apache.org/docs/2.0/rewrite/rewrite_guide.html

    switch the version according to what you have installed on the server. also, if you have suPHP, you need to check its documentation.

  12. I was recently a victim of a hacker and destroyed almost every site I had developed. It was certainly a wake-up call for me to start learning about website security. I came to visit your site via Lynda Secure Sites tutorials and it has opened my eyes to all the “nasties” I had no idea existed. Thanks for doing such a wonderful job explaining everything in simple terms so that a newbie such as myself can understand it all and implement it. I am….Forever grateful!

[ Comments are closed for this post ]