Building the 3G Blacklist, Part 2: Improving Site Security by Preventing Malicious Query-String Exploits

[ 3G Stormtroopers ]

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this second article, I present an incredibly powerful method for eliminating malicious query-string exploits. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. The series will culminate in the release of the next-generation 3G Blacklist.

Improving Site Security by Preventing Malicious Query String Exploits

The vast majority of website attacks involve appending malicious query strings onto legitimate, indexed URLs. Any webmaster serious about site security is all too familiar with generalized access-log entries such as the following:

http://domain.tld/subdomain/path/to/some/post/file.php?test=http://malicious.domain.tld/attack.php

http://domain.tld/subdomain/another/fine/article/index.php?config=http://wasted.domain.tld/vector.do??

http://domain.tld/subdomain/yet/another/indexed/post/open.php?path=%22http://zombie.domain.tld/test.txt?

Generally, sites that suffer query-string attacks receive hundreds or thousands of such URL requests during relatively short periods of time. Apparently, these URLs are generated automatically via scripts executed from decentralized networks of compromised computers. As discussed elsewhere, these automated “zombie” attacks leave tracks that are difficult to unify via some common element. For example, consider the following:

  • Attackers frequently and randomly disguise declared user agents
  • Scripts are generally executed via random sets of IP addresses
  • Referrer information is typically absent from site access logs
  • Attacks target a wide variety of well-known (indexed) URLs
  • Query strings are usually appended to unpredictable file names
  • Query strings consist of unpredictable character sequences

These disassociated characteristics make it difficult to predict, and thus protect against, future attacks. Fortunately, the vast majority of URL requests employing malicious query strings contain a secondary URL, which is most likely the address of some nasty exploit script. These secondary URLs generally consist of apparently random, unpredictable sequences of characters and terms. Nonetheless, virtually all of these secondary query-string URLs include one of these three transfer protocols:

  • ftp://
  • http://
  • https://

But there’s a catch: when used in secondary URLs, these protocol prefixes are not always complete. Either intentionally or not, URLs in query strings often register improperly, with one or both slashes, the colon, or even portions of the http (or equivalent) omitted entirely. Thus, trying to match any of these character strings too closely will inevitably lead to false negatives.

On the other hand, legitimate inclusion of various transfer protocols is quite common, especially among content management platforms. For example, the deathly popular WordPress passes HTTP referrer data via query strings during various comment management tasks. In the process, the characters “http_” are included in the query string. Thus, matching any of the protocol strings too loosely will inevitably lead to false positives, and worse.
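To make the distinction concrete, here is a purely hypothetical illustration; the parameter name and value are invented for this example and are not actual WordPress parameters. A harmless query string such as action=approve&ref=http_referrer_data contains the fragment "http_" but no protocol prefix:

# HYPOTHETICAL ILLUSTRATION ONLY -- not part of the blacklist
#
# Too loose: the bare pattern "http" matches the harmless "http_" fragment
#  RewriteCond %{QUERY_STRING} http [NC]
#
# Anchored on the colon: matches only a genuine protocol prefix such as "http:"
#  RewriteCond %{QUERY_STRING} http\: [NC]

The colon-anchored pattern leaves the legitimate request alone while still catching any query string that embeds a complete ftp:, http:, or https: prefix.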

The key to blocking the vast majority of malicious query-string exploits, then, is precise regex pattern matching that minimizes both false negatives and false positives. Fortunately, by testing the QUERY_STRING variable with Apache’s mod_rewrite module, it is possible to block these exploits both effectively and efficiently.

The Magic Bullet

To implement this elegant blacklist solution, place the following code into either your site’s root htaccess file or your server’s configuration file (Note: this method will also be included in the completed version of the 3G Blacklist):

# BLOCK BAD QUERY STRINGS
<IfModule mod_rewrite.c>
 # the rewrite engine must be enabled; harmless to repeat if already on elsewhere
 RewriteEngine On
 RewriteCond %{QUERY_STRING} ftp\:   [NC,OR]
 RewriteCond %{QUERY_STRING} http\:  [NC,OR]
 RewriteCond %{QUERY_STRING} https\: [NC]
 RewriteRule .* -                    [F,L]
</IfModule>

Upload and test, test, test. I have been testing these directives here at Perishable Press for several weeks now, and have experienced excellent results. This single chunk of code is responsible for a dramatic 80% drop in the overall number of server attacks as recorded in my access and error logs. Highly recommended! ;)

Briefly, let’s examine how this code works. First, notice that we enclose the rewrite directives within an <IfModule> container. This prevents the code from crashing your site should the required Apache module ( mod_rewrite ) prove unavailable. Within the container, three RewriteCond directives match the three commonly deployed protocols: ftp:, http:, and https:. As previously discussed, the patterns omit the two forward slashes in order to reduce the number of false negatives. Finally, once any of the target character strings is matched, the RewriteRule in the last line leaves the requested URL untouched (the hyphen means “no substitution”) and returns a 403 Forbidden response. Nice, clean, and easy. ;)
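For reference, the three conditions can also be collapsed into a single RewriteCond via regex alternation. This is an equivalent variation rather than part of the official 3G code:

# BLOCK BAD QUERY STRINGS (compact variation)
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} (ftp|https?)\: [NC]
 RewriteRule .* - [F,L]
</IfModule>

Functionally the two forms are identical; keeping the conditions on separate lines, as in the original, simply makes it easier to comment out or adjust individual protocols while troubleshooting.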

Wrap Up..

Once in place, this method will effectively eliminate a significant number of malicious query-string attacks. Upon investigation of your access/error logs, you should see a dramatic decrease in the number of rogue 404 Not Found responses, and a dramatic increase in the number of returned 403 Forbidden responses. Overall, this is one of the most elegant, effective, and efficient blacklisting techniques I have ever had the pleasure of using. :)

As always, if you are unable to access certain URLs on your site after implementing this method, immediately comment out the three RewriteCond lines and try again. If you then have access, check the URL for a query string containing one of the blocked character strings. Depending on what you find, uncomment and/or edit the code as necessary. If you need further assistance or have specific questions, leave a comment or contact me directly and I will do my best to help you out.
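If you do discover a legitimate URL that trips the filter, another option is to whitelist that specific query string ahead of the protocol checks. The parameter name and domain below are placeholders, so treat this as a rough sketch rather than part of the official blacklist:

# BLOCK BAD QUERY STRINGS (with a hypothetical exception)
<IfModule mod_rewrite.c>
 RewriteEngine On
 # skip the block when the query string is a known-good redirect parameter
 RewriteCond %{QUERY_STRING} !^redirect_to=https?://(www\.)?your-domain\.tld/ [NC]
 RewriteCond %{QUERY_STRING} (ftp|https?)\: [NC]
 RewriteRule .* - [F,L]
</IfModule>

Because consecutive RewriteCond lines are joined with a logical AND by default, the negated whitelist condition must hold (i.e., the query string is not the known-good one) before the protocol check is even considered.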

Next..

Stay tuned for the continuation of Building the 3G Blacklist, Part 3: Improving Site Security by Selectively Blocking Rogue User Agents. If you have yet to do so, I highly encourage you to subscribe to Perishable Press. As always, thank you for your generous attention.