Protecting your website is more important than ever. There are a million ways to do it, and this is one of them. In fact, it’s what I use to protect Perishable Press and other key sites. It’s called the 5G Blacklist, and it’s something I’ve been working on for a long time. The idea is simple enough: analyze bad requests and block them using a firewall/blacklist via .htaccess. Now in its 5th generation, the 5G Blacklist has evolved into a […] Continue reading »
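The complete ruleset is in the post itself; purely as a hedged sketch of the approach (patterns of my choosing, not the actual 5G rules), a query-string firewall in .htaccess looks something like this:

<IfModule mod_rewrite.c>
 RewriteEngine On
 # Deny requests whose query strings carry common attack patterns
 # (illustrative patterns only, not the 5G list)
 RewriteCond %{QUERY_STRING} (eval\() [NC,OR]
 RewriteCond %{QUERY_STRING} (base64_encode|base64_decode) [NC,OR]
 RewriteCond %{QUERY_STRING} (\.\./) [NC]
 RewriteRule .* - [F,L]
</IfModule>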
Please excuse this self-serving, miscellaneous post, but I’ve just got to purge all of these code snippets and scraps collected over the years. Whenever I update this site, I place any removed/unused code snippets into a giant note file for future reference, just in case. All sorts of code and snippets just keep growing and growing, until finally it gets to a point where I just need to dump everything and start fresh. Welcome […] Continue reading »
If you want to block tough proxies like hidemyass.com, my previously posted .htaccess methods won’t work. Those methods will block quite a few proxy visits to your site, but they won’t stop the stealthier proxies. Fortunately, we can use a bit of PHP to keep them out. Continue reading »
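The post walks through the actual PHP; as a hedged sketch of one technique known to catch header-stripping proxies (not necessarily the post's exact method), a script can probe the visiting IP for ports that open proxies commonly leave listening:

<?php
// Hedged sketch, not the post's exact method: probe the visiting IP
// for commonly open proxy ports. Proxies that strip the usual headers
// can still be caught this way.
$ports = array(8080, 80, 6588, 3128, 1080);
foreach ($ports as $port) {
	$fp = @fsockopen($_SERVER['REMOTE_ADDR'], $port, $errno, $errstr, 1);
	if ($fp) {
		fclose($fp);
		header('HTTP/1.1 403 Forbidden');
		exit('Proxy access denied.');
	}
}
?>

The trade-off is a short connection delay for every visitor, so the timeout should stay low.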
Quick WordPress tip for easily and quietly blocking a ton of comment spam. Akismet and other programs are good at catching most spam, but every now and then a bunch of weird, foreign-language spam will sneak past the filters and post live to your site. Here’s a good example of the kind of stuff that’s easy to block: Continue reading »
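The spam samples and the actual tip are in the full post; purely as a hypothetical illustration of one way to catch mostly-foreign-language comments in WordPress (the function name and threshold are my inventions):

<?php
// Hypothetical illustration only, not the post's actual tip:
// mark comments that are mostly non-Latin characters as spam.
function pp_flag_foreign_spam($approved, $commentdata) {
	$text  = $commentdata['comment_content'];
	$total = max(1, mb_strlen($text, 'UTF-8'));
	// Count characters in the Basic Latin through Latin Extended-B range
	$latin = preg_match_all('/[\x{0000}-\x{024F}]/u', $text);
	if (($latin / $total) < 0.5) return 'spam';
	return $approved;
}
add_filter('pre_comment_approved', 'pp_flag_foreign_spam', 10, 2);
?>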
An update to the 4G Blacklist, the new 5G Firewall is now open for beta testing. The new code is better than ever, providing wider protection with less code and fewer false positives. I’ve had much success with this new firewall, but more testing is needed to ensure maximum compatibility and minimal issues. Continue reading »
Like most sites on the Web, Perishable Press is scanned constantly by malicious scripts looking for vulnerabilities and exploit opportunities. There is no end to the type and variety of malicious URL requests. It all depends on the script, the target, and the goal of the attack. Malicious scripts generally seek one of two things: Continue reading »
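For context, here is a hedged .htaccess sketch of the kind of countermeasure such scanning calls for (illustrative targets of my choosing, not the post's list), refusing a few classic exploit requests outright:

# Deny direct requests for a few classic exploit targets
# (illustrative patterns only)
RedirectMatch 403 (wp-config\.php|timthumb\.php|\.bash_history)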
Recently cleared several megabytes of log files, detecting patterns, recording anomalies, and blacklisting gross offenders. Gonna break it down into three sections: User Agents, Character Strings, and IP Addresses. User Agents: user-agents come and go, and are easily spoofed, but it’s worth a few lines of htaccess to block the more persistent bots that repeatedly scan your site with malicious requests.

# Nov 2010 User Agents
SetEnvIfNoCase User-Agent "MaMa " keep_out
SetEnvIfNoCase User-Agent "choppy" keep_out
SetEnvIfNoCase User-Agent "heritrix" keep_out
SetEnvIfNoCase User-Agent […] Continue reading »
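The excerpt cuts off before the directives that act on the keep_out variable; assuming the standard mod_setenvif pairing (my completion, not necessarily the post's exact code), the block would finish along these lines:

# Standard companion block (my completion): deny anything
# flagged keep_out by the SetEnvIfNoCase rules above
Order Allow,Deny
Allow from all
Deny from env=keep_out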
The 2010 User-Agent Blacklist blocks hundreds of bad bots while ensuring open access for the major search engines: Google, Bing, Ask, Yahoo, et al. Blocking bad user-agents is an effective addition to any security strategy. It works like this: your site is getting hammered by rogue bots that waste valuable server resources and bandwidth. So you grab a copy of the 2010 UA Blacklist from Perishable Press, include it in your site’s root .htaccess file, and enjoy better security and performance. […] Continue reading »
One of my favorite security measures here at Perishable Press is the site’s virtual Blackhole trap for bad bots. The concept is simple: include a hidden link to a robots.txt-forbidden directory somewhere on your pages. Bots that ignore or disobey your robots rules will crawl the link and fall into the honeypot trap, which then performs a WHOIS Lookup and records the event in the blackhole data file. Once added to the blacklist data file, bad bots are immediately denied […] Continue reading »
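The actual Blackhole script is available in the full post; purely as a hedged sketch of the flow described above (file name and details are my assumptions), the trap logic amounts to:

<?php
// Hedged sketch of the trap flow, not the actual script. Placed in the
// forbidden directory, it bans any bot that follows the hidden link.
$ip   = $_SERVER['REMOTE_ADDR'];
$file = 'blackhole.dat';
$bad  = file_exists($file) ? file($file, FILE_IGNORE_NEW_LINES) : array();
if (in_array($ip, $bad)) {
	header('HTTP/1.1 403 Forbidden');
	exit('You have been banned from this site.');
}
// First offense: record the IP (the real script also runs a WHOIS lookup)
file_put_contents($file, $ip . "\n", FILE_APPEND | LOCK_EX);
echo 'You have fallen into a trap for bad bots.';
?>

The hidden link plus the matching Disallow rule in robots.txt is what separates disobedient bots from legitimate crawlers.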
Over the course of each year, I blacklist a considerable number of individual IP addresses. Every day, Perishable Press is hit by countless spammers, scrapers, crackers and all sorts of other hapless turds. Weekly examinations of my site’s error logs enable me to filter through the chaff and cherry-pick only the most heinous, nefarious attackers for blacklisting. Minor offenses are generally dismissed, but the evil bastards that insist on wasting resources running redundant automated scripts are immediately investigated […] Continue reading »
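For reference, the blacklisting step itself is plain .htaccess; a hedged example using documentation-range addresses rather than real offenders:

# Example entries only (documentation-range IPs, not my actual list)
Order Allow,Deny
Allow from all
Deny from 192.0.2.15
Deny from 198.51.100.0/24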
If you’ve been keeping an eye on your 404 errors recently, you will have noticed an increase in requests for nonexistent mobile files and directories, especially over the past year or so. The scripts and bots requesting these files from your server seem to be looking for a mobile version of your site. Unfortunately, they are wasting bandwidth and resources in the process. It has become common to see the following 404 errors constantly repeated in your log files: http://domain.tld/apple-touch-icon.png […] Continue reading »
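One common way to quiet these requests, sketched here with my own pattern rather than the fix from the full post, is to refuse them outright in .htaccess so they never reach heavier 404 handling:

# Quietly refuse the phantom mobile-file requests instead of 404ing
# (pattern is my sketch, not necessarily the post's fix)
RedirectMatch 403 apple-touch-icon(-precomposed)?\.png$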
Running a private site is all about preventing unwanted visitors. Here is a quick and easy way to allow access to multiple IP addresses while redirecting everyone else to a custom message page. To do this, all you need is an HTAccess file and a list of IPs for which you would like to allow access. Continue reading »
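A minimal sketch of the idea, assuming a message page at /message.html and documentation-range addresses in place of your real allow list:

<IfModule mod_rewrite.c>
 RewriteEngine On
 # Let these addresses through (documentation-range placeholders)
 RewriteCond %{REMOTE_ADDR} !^192\.0\.2\.10$
 RewriteCond %{REMOTE_ADDR} !^198\.51\.100\.25$
 # Send everyone else to the custom message page
 RewriteCond %{REQUEST_URI} !^/message\.html$
 RewriteRule .* /message.html [R=302,L]
</IfModule>

The condition excluding /message.html itself prevents a redirect loop for the visitors being turned away.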
Let’s face it. There’s just as much scum on the Internet as there is out there in the “real world.” Maybe even more, who knows. From scammers and spammers to scrapers and crackers, the Web is just crawling with all sorts of pathetic scumbags. As predictably random as much of the malicious activity happens to be, it is virtually guaranteed that you will be hounded by at least a few persistent IP addresses that, for whatever reason, have latched on […] Continue reading »
Given my propensity to discuss matters involving error log data (e.g., monitoring malicious behavior, setting up error logs, and creating extensive blacklists), I am often asked about the best way to go about monitoring 404 and other types of server errors. While I consider myself to be a novice in this arena (there are far brighter people with much greater experience), I do spend a lot of time digging through log entries and analyzing data. So, when asked recently about […] Continue reading »
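As a minimal starting point (a sketch, not my full setup), Apache can at least route every miss to a custom page whose requests you can then log and review:

# Point 404s at a custom page that can record the failed request
# (path is a placeholder)
ErrorDocument 404 /404.php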
You have seen user-agent blacklists, IP blacklists, 4G Blacklists, and everything in between. Now, in this article, for your sheer and utter amusement, I present a collection of over 8000 blacklisted referrers. Shortcut: skip the article and jump to Disclaimer and Download » Referrer Spam Sucks For the uninitiated, in the language of the Web, a referrer is the online resource from whence a visitor happened to arrive at your site. For example, if Johnny the Wonder Parrot was visiting the […] Continue reading »
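For context, referrer blacklisting in .htaccess follows this shape (placeholder domains of my choosing, not entries from the actual list):

<IfModule mod_rewrite.c>
 RewriteEngine On
 # Deny requests referred from blacklisted domains (placeholders only)
 RewriteCond %{HTTP_REFERER} (poker|casino|pills)\. [NC,OR]
 RewriteCond %{HTTP_REFERER} ^http://(www\.)?spam-example\.com [NC]
 RewriteRule .* - [F,L]
</IfModule>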
As discussed in my recent article, Eight Ways to Blacklist with Apache’s mod_rewrite, one method of stopping spammers, scrapers, email harvesters, and malicious bots is to blacklist their associated user agents. Apache enables us to target bad user agents by testing the user-agent string against a predefined blacklist of unwanted visitors. Any bot identifying itself as one of the blacklisted agents is immediately and quietly denied access. While this certainly isn’t the most effective method of securing your site against […] Continue reading »
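The mechanism from that article, sketched with placeholder agent names rather than the real blacklist:

<IfModule mod_rewrite.c>
 RewriteEngine On
 # Deny blacklisted user agents (placeholder names, not the full list)
 RewriteCond %{HTTP_USER_AGENT} (EmailSiphon|BadBot|EvilScraper) [NC]
 RewriteRule .* - [F,L]
</IfModule>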