
Controlling Proxy Access with HTAccess

In my recent article on blocking proxy servers, I explain how to use HTAccess to deny site access to a wide range of proxy servers. The method works great, but some readers want to know how to allow access for specific proxy servers while denying access to as many other proxies as possible. Fortunately, the solution is as simple as adding a few lines to my original proxy-blocking method. Specifically, we may allow any requests coming from our whitelist of proxy servers by testing Apache’s HTTP_REFERER variable, like so:

```apache
RewriteCond %{HTTP_REFERER} !(.*)allowed-proxy-01.domain.tld(.*)
RewriteCond %{HTTP_REFERER} !(.*)allowed-proxy-02.domain.tld(.*)
RewriteCond %{HTTP_REFERER} !(.*)allowed-proxy-03.domain.tld(.*)
```

Read more »
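To see how the whitelist fits together with the blocking logic, here is a minimal sketch (the hostnames are placeholders carried over from the excerpt, and the proxy-header tests stand in for the fuller rule set of the original article):

```apache
# Sketch: allow whitelisted proxies, forbid other proxy traffic.
# Hostnames are placeholders; adjust for your own trusted proxies.
RewriteEngine On
# If the referrer matches a whitelisted proxy, these negated
# conditions fail and the forbid rule below is skipped.
RewriteCond %{HTTP_REFERER} !allowed-proxy-01\.domain\.tld
RewriteCond %{HTTP_REFERER} !allowed-proxy-02\.domain\.tld
RewriteCond %{HTTP_REFERER} !allowed-proxy-03\.domain\.tld
# Otherwise, treat common proxy headers as a fingerprint and deny.
RewriteCond %{HTTP:VIA}             !^$ [OR]
RewriteCond %{HTTP:FORWARDED}       !^$ [OR]
RewriteCond %{HTTP:X-FORWARDED-FOR} !^$
RewriteRule .* - [F,L]
```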

Eight Ways to Blacklist with Apache’s mod_rewrite

With the imminent release of the next series of blacklist articles (the 4G series) here at Perishable Press, now is the perfect time to examine eight of the most commonly employed blacklisting methods achieved with Apache’s incredible rewrite module, mod_rewrite. In addition to facilitating site security, the techniques presented in this article will improve your understanding of the different rewrite methods available with mod_rewrite. The first of these methods blacklists via the request method: every time a client attempts to connect to your server, it sends a message indicating the type of connection it wishes to make. There are […] Read more »
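For a taste of that first method, a request-method blacklist generally looks something like this (a hedged sketch; the methods listed are illustrative, not necessarily the article’s exact choices):

```apache
# Sketch: forbid request methods that a typical public site never needs.
RewriteEngine On
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK|DELETE|CONNECT)$ [NC]
RewriteRule .* - [F,L]
```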

PHP Short Open Tag: Convenient Shortcut or Short Changing Security?

Most of us learned how to use “echo()” in one of our very first PHP tutorials. That was certainly the case for me. As a consequence, I never really had a need to visit PHP’s documentation page for echo(). On a recent visit to Perishable Press, I saw a Tumblr post from Jeff about the use of PHP’s shortcut syntax for echo(), but somewhere deep in my memory there lurked a warning about its use. I decided to investigate. Read more »
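For readers who haven’t met it, the shortcut in question is the short echo tag. A quick illustration (note that before PHP 5.4, the `<?=` form required the short_open_tag setting to be enabled, which is the portability concern at issue):

```php
<?php $title = 'Hello, world'; ?>

<!-- Full syntax: works on any PHP configuration -->
<h1><?php echo $title; ?></h1>

<!-- Short echo tag: identical output, but on PHP versions
     before 5.4 it depends on short_open_tag being enabled -->
<h1><?= $title ?></h1>
```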

Yahoo! Lies about Obeying Robots.txt Directives

There are two possibilities here: Yahoo!’s Slurp crawler is broken, or Yahoo! lies about obeying Robots directives. Neither case is good. Slurp just can’t seem to keep its nose out of my private business. And, as I’ve discussed before, this happens all the time. Here are the two most recent offenses, as recorded in the log file for my blackhole spider trap: Read more »
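For context, the kind of robots.txt directive that Slurp is supposed to honor looks like this (the paths here are hypothetical stand-ins, not the site’s actual rules):

```
# Hypothetical robots.txt rules of the sort a compliant crawler obeys
User-agent: Slurp
Disallow: /private/

User-agent: *
Disallow: /blackhole/
```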

Blacklist Candidate 2008-10-19

Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating and banishing spammers, crackers and other worthless scumbags. From time to time on a typical game show, a contestant places a bid that is so absurd and so asinine that you literally laugh out loud, point at the monitor, and openly ridicule the pathetic loser. On such occasions, even the host of the show will laugh and mock the idiocy. Of course, this same situation happens frequently here at Perishable Press, where the scumbags that manage to escape the 3G Blacklist are proving […] Read more »

Blacklist Candidate Series Summary

An ongoing series of articles on the fine art of malicious exploit detection and prevention. Learn how to prevent the sneaky, mischievous, and deceptive practices of some of the worst spammers, scrapers, crackers, and other scumbags on the Internet. Read more »

Evil Incarnate, but Easily Blocked

As my readers know, I spend a lot of time digging through error logs, preventing attacks, and reporting results. Occasionally, some moron will pull a stunt that deserves exposure, public humiliation, and banishment. There is certainly no lack of this type of nonsense, as many of you are well aware. Even so, I have to admit that I am very happy with my latest strategy against crackers, spammers, and other scumbags, namely the 3G Blacklist. Since implementing this effective HTAccess security method, I have seen a dramatic decrease in the overall volume of malicious activity recorded in my Apache […] Read more »

Yahoo! Once Again Caught Disobeying Robots.txt Rules

Hmmm.. Let’s see here. Google can do it. MSN/Live can do it. Even Ask can do it. So why oh why can’t Yahoo!’s grubby Slurp crawler manage to adhere to robots.txt crawl directives? Just when I thought Yahoo! had finally figured it out, I discover more Slurp tracks in my Blackhole trap for bad spiders: Read more »
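Since robots.txt is purely advisory, misbehaving crawlers can also be shut out at the server level. A minimal sketch of the idea (a blunt, illustrative rule, not the site’s actual configuration):

```apache
# Sketch: deny all requests from Yahoo's Slurp crawler outright,
# since it cannot be trusted to honor robots.txt directives.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Slurp [NC]
RewriteRule .* - [F,L]
```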

Unexplained Crawl Behavior Involving Tagged Query Strings

I need your help! I am losing my mind trying to solve another baffling mystery. For the past three or four months, I have been recording many 404 errors generated from msnbot, Yahoo-Slurp, and other spider crawls. These errors result from invalid requests for URLs containing query strings such as the following:

https://perishablepress.com/press/page/2/?tag=spam
https://perishablepress.com/press/page/3/?tag=code
https://perishablepress.com/press/page/2/?tag=email
https://perishablepress.com/press/page/2/?tag=xhtml
https://perishablepress.com/press/page/4/?tag=notes
https://perishablepress.com/press/page/2/?tag=flash
https://perishablepress.com/press/page/2/?tag=links
https://perishablepress.com/press/page/3/?tag=theme
https://perishablepress.com/press/page/2/?tag=press

..plus hundreds and hundreds more. The URL pattern is always the same: a different page number followed by a query string containing one of the tags used here at Perishable Press, for example: “/?tag=something”. The problem is that there are […] Read more »
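While the mystery remains open, one possible stopgap (a suggestion of this summary, not from the article itself) is to 301-redirect such requests to the corresponding tag archive so the crawlers stop accumulating 404s. A sketch, assuming tag archives live at /press/tag/ (a hypothetical URL structure):

```apache
# Hypothetical stopgap: send /press/page/N/?tag=foo requests
# to /press/tag/foo/ (assumes that is the real tag-archive URL)
RewriteEngine On
RewriteCond %{QUERY_STRING} ^tag=([a-z0-9-]+)$ [NC]
RewriteRule ^press/page/\d+/?$ /press/tag/%1/? [R=301,L]
```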

Blacklist Candidate Number 2008-05-31

Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating and banishing spammers, crackers and other worthless scumbags. Just under the wire! Even so, this month’s official Blacklist-Candidate article may be the last monthly installment of the series. Although additional BC articles may appear in the future, it is unlikely that they will continue as a regular monthly feature. Oh sure, I see the tears streaming down your face, but think about it: this is actually good news. After implementing the 3G Blacklist, finding decent blacklist candidates is becoming increasingly difficult. […] Read more »

Series Summary: Building the 3G Blacklist

In the now-complete series, Building the 3G Blacklist, I share insights and discoveries concerning website security and protection against malicious attacks. Each article in the series focuses on a unique blacklist strategy designed to protect sites transparently, effectively, and efficiently. The five articles culminate in the release of the next-generation 3G Blacklist. For the record, here is a quick summary of the entire Building the 3G Blacklist series: Read more »

Improve Site Security by Protecting HTAccess Files

As you know, HTAccess files are powerful tools for manipulating site performance and functionality. Protecting your site’s HTAccess files is critical to maintaining a secure environment. Fortunately, preventing access to your HTAccess files is very easy. Let’s have a look.. If you search around the Web, you will probably find several different methods of protecting your HTAccess files. Here are a few examples, along with a bit of analysis. Case-sensitive protection: as far as I know, this is the most widespread method of protecting HTAccess files. Very straightforward, this code will prevent anyone from accessing any file […] Read more »
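The case-sensitive technique mentioned above generally takes the following shape (a sketch of the common pattern, not necessarily the article’s exact code):

```apache
# Deny all web access to files named ".htaccess" (case-sensitive match)
<Files .htaccess>
	Order allow,deny
	Deny from all
</Files>
```

Its weakness is the case-sensitivity of the match; a regex variant such as <Files ~ "^\.([Hh][Tt])"> is a common case-tolerant alternative.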

Perishable Press 3G Blacklist

After much research and discussion, I have developed a concise, lightweight security strategy for Apache-powered websites. Prior to the development of this strategy, I relied on several extensive blacklists to protect my sites against malicious user agents and IP addresses. Over time, these mega-lists became unmanageable and ineffective. As increasing numbers of attacks hit my server, I began developing new techniques for defending against external threats. This work soon culminated in the release of a “next-generation” blacklist that works by targeting common elements of decentralized server attacks. Consisting of a mere 37 lines, this “2G” Blacklist provided enough protection to […] Read more »
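To make the “common elements” idea concrete, here is an illustrative fragment in the same spirit (example patterns only; the actual 3G directives are given in the article):

```apache
# Illustrative only: forbid a few character strings that recur
# across many automated exploit scans, regardless of their source.
RewriteEngine On
RewriteCond %{QUERY_STRING} \.\./                    [OR]
RewriteCond %{QUERY_STRING} boot\.ini                [NC,OR]
RewriteCond %{QUERY_STRING} (%3C|<).*script.*(%3E|>) [NC]
RewriteRule .* - [F,L]
```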

Building the 3G Blacklist, Part 5: Improving Site Security by Selectively Blocking Individual IPs

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. Wrapping up the series with this article, I provide the final key to our comprehensive blacklist strategy: selectively blocking individual IPs. The previous articles focus on other key blacklist strategies designed to protect your site transparently, effectively, and efficiently. In the next article, the series will culminate in the release of the next-generation 3G Blacklist. The final component of the 3G Blacklist establishes a vehicle through which individual IPs may be blocked. As […] Read more »
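The underlying mechanism for blocking individual IPs is simple. A minimal sketch using classic Apache 2.2 access directives (the addresses are documentation placeholders, not real offenders):

```apache
# Sketch: selectively deny individual IPs and a partial range
# (addresses below are reserved documentation placeholders)
Order allow,deny
Allow from all
Deny from 192.0.2.1
Deny from 203.0.113.27
Deny from 198.51.100.
```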

Building the 3G Blacklist, Part 4: Improving the RedirectMatch Directives of the Original 2G Blacklist

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this fourth article, I build upon previous ideas and techniques by improving the directives contained in the original 2G Blacklist. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next-generation 3G Blacklist. In the first version (2G) of the next-generation blacklist, a collection of malicious attack strings […] Read more »
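As background, RedirectMatch-based blacklisting returns a 403 Forbidden response for any request URI matching a forbidden pattern. An illustrative sketch (example patterns, not the blacklist’s actual directive set):

```apache
# Illustrative RedirectMatch blacklisting: any request URI matching
# one of these patterns receives a 403 Forbidden response.
<IfModule mod_alias.c>
	RedirectMatch 403 \.\./
	RedirectMatch 403 /etc/passwd
	RedirectMatch 403 (%3C|%3E)
</IfModule>
```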
