For the past week, I’ve been monitoring activity from a set of IP addresses involved in brute-force login attacks. These attacks systematically guess passwords against common usernames such as “admin” and “username”. For example, an attack will target an array of sites, use “admin” as the username, and then make numerous attempts at “guessing” your password. To obfuscate the malicious activity, the attack is executed from multiple IP addresses, either via proxy or possibly […] Continue reading »
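As a rough illustration of the kind of countermeasure discussed in the post (the addresses shown are documentation-range placeholders, not the actual offenders logged), blocking a known set of attacking IPs in .htaccess might look like this:

# deny known brute-force IP addresses (placeholder examples)
<Limit GET POST>
 Order Allow,Deny
 Allow from all
 Deny from 192.0.2.1
 Deny from 192.0.2.2
</Limit>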
I love a good story. Almost as much as I enjoy securing websites. Put them together and you’ve got suspense, intrigue, and plenty of encoded gibberish. But there’s no happy ending this time; in this case, the smartest decision was to “pull it” and rebuild. The site was just wasted, completely riddled with malicious code. Without current backup data, it would’ve been “game over” for the site, and possibly the business. Continue reading »
Working on the next version of the G-Series Blacklist, I needed a way to match a wide variety of UTF-8-encoded (hex) character strings. Those familiar with their site’s traffic will recognize this particular type of URI request string, which is typically associated with server scanning, exploits, and other malicious behavior. As I explain in this post, pattern-matching and blocking the blank-space (whitespace) character in URL requests is an effective way to improve the security of your website. Continue reading »
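The general technique can be sketched in a few lines of .htaccess; this is a minimal illustration of the idea, not the actual blacklist patterns, which are more extensive:

# return 403 for any request URI containing a whitespace character
<IfModule mod_alias.c>
 RedirectMatch 403 \s
</IfModule>

Because RedirectMatch operates on the decoded URL path, this also catches requests that encode the space as %20.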
Here’s a cool trick that you may not have known about: it’s possible to get case-insensitive matching with the powerful RedirectMatch directive. Normally, you would just write your redirect as something like this: Continue reading »
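The example itself is cut off in this excerpt, but the trick relies on PCRE’s inline (?i) modifier, which RedirectMatch patterns support. A minimal sketch, with hypothetical paths:

# case-sensitive by default: matches only /old-page
RedirectMatch 301 /old-page http://example.com/new-page

# case-insensitive via inline (?i) flag: also matches /Old-Page, /OLD-PAGE, etc.
RedirectMatch 301 (?i)/old-page http://example.com/new-page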
Ill requests and malicious scans have been spiking recently, to the point where server performance has really taken a hit. One scan in particular hammered the server with thousands of bad requests in just a few minutes. There are people out there with strong scripts and small minds who are constantly scanning sites for vulnerabilities, and much of what I’ve seen is aimed primarily at WordPress. Continue reading »
The 5G Blacklist helps reduce the number of malicious URL requests that hit your website. It’s one of many ways to improve the security of your site and protect against evil exploits, bad requests, and other nefarious garbage. If you’re tired of all the slow, bloated security plugins and expensive 3rd-party services, the 5G Blacklist is a solid solution to help protect your Apache-powered site. Continue reading »
By design the 5G Blacklist works on Apache servers, but thanks to Scott Stawarz, here is a version for Microsoft IIS. Disclaimer: I do not use any Microsoft server stuff, so make sure to properly test everything before running this code on a live/production site. Also, if you scroll down to the end of this article, you will find some useful bonus snippets. Continue reading »
Protecting your website is more important than ever. There are a million ways to do it, and this is one of them. In fact, it’s what I use to protect Perishable Press and other key sites. It’s called the 5G Blacklist, and it’s something I’ve been working on for a long time. The idea is simple enough: analyze bad requests and block them using a firewall/blacklist via .htaccess. Now in its 5th generation, the 5G Blacklist has evolved into a […] Continue reading »
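To illustrate the basic shape of the approach (a simplified sketch only, not the actual 5G rules), a blacklist entry might deny any request whose query string contains a known exploit pattern:

# illustrative sketch only - not the actual 5G Blacklist
<IfModule mod_rewrite.c>
 RewriteEngine On
 # deny requests whose query strings contain common exploit signatures
 RewriteCond %{QUERY_STRING} (eval\() [NC,OR]
 RewriteCond %{QUERY_STRING} (base64_) [NC]
 RewriteRule .* - [F,L]
</IfModule>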
Please excuse this self-serving, miscellaneous post, but I’ve just got to purge all of these code snippets and scraps collected over the years. Whenever I update this site, I place any removed/unused code snippets into a giant note file for future reference, just in case. There are all sorts of different types of code and snippets that just keep growing and growing, until finally it gets to a point where I just need to dump everything and start fresh. Welcome […] Continue reading »
If you want to block tough proxies like hidemyass.com, my previously posted .htaccess methods won’t work. Those methods will block many proxy visits to your site, but won’t stop the stealthier proxies. Fortunately, we can use a bit of PHP to keep them out. Continue reading »
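The post’s actual code isn’t shown in this excerpt; as a rough sketch of one common PHP approach, a script can check for proxy-revealing request headers and refuse the request. The header list and response here are illustrative assumptions, and the stealthiest proxies may require additional measures:

<?php
// Illustrative sketch only - not the exact code from the post.
// Deny requests that arrive with common proxy-revealing headers.
$proxy_headers = array(
	'HTTP_VIA',
	'HTTP_X_FORWARDED_FOR',
	'HTTP_FORWARDED',
	'HTTP_CLIENT_IP',
);
foreach ($proxy_headers as $header) {
	if (!empty($_SERVER[$header])) {
		header('HTTP/1.1 403 Forbidden');
		exit('Proxy access denied.');
	}
}
?>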
Quick WordPress tip for easily and quietly blocking a ton of comment spam. Akismet and other programs are good at catching most spam, but every now and then a bunch of weird, foreign-language spam will sneak past the filters and post live to your site. Here’s a good example of the kind of stuff that’s easy to block: Continue reading »
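The sample spam is cut off in this excerpt, but the general blocking technique can be sketched with WordPress’s pre_comment_approved filter: flag comments containing, say, Cyrillic or CJK characters as spam. The character ranges and function name are assumptions for illustration, not necessarily the post’s actual method:

<?php
// Illustrative sketch only - mark comments containing Cyrillic or
// CJK characters as spam. Adjust the ranges to suit your site.
function pp_block_foreign_spam($approved, $commentdata) {
	$content = $commentdata['comment_content'];
	if (preg_match('/[\x{0400}-\x{04FF}\x{4E00}-\x{9FFF}]/u', $content)) {
		return 'spam';
	}
	return $approved;
}
add_filter('pre_comment_approved', 'pp_block_foreign_spam', 10, 2);
?>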
An update to the 4G Blacklist, the new 5G Firewall is now open for beta testing. The new code is better than ever, providing wider protection with less code and fewer false positives. I’ve had much success with this new firewall, but more testing is needed to ensure maximum compatibility and minimal issues. Continue reading »
Like most sites on the Web, Perishable Press is scanned constantly by malicious scripts looking for vulnerabilities and exploit opportunities. There is no end to the type and variety of malicious URL requests. It all depends on the script, the target, and the goal of the attack. Malicious scripts generally seek one of two things: Continue reading »
I am in the process of migrating my sites from A Small Orange to Media Temple. Part of that process involves canonicalizing domain URLs to help maximize SEO strategy. At ASO, URL canonicalization required just a few htaccess directives:

# enforce no www prefix
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_HOST} !^domain\.tld$ [NC]
 RewriteRule ^(.*)$ http://domain.tld/$1 [R=301,L]
</IfModule>

When placed in the web-accessible root directory’s htaccess file, that snippet will ensure that all requests for your site are not prefixed with www. There’s […] Continue reading »
Recently I cleared several megabytes of log files, detecting patterns, recording anomalies, and blacklisting gross offenders. Gonna break it down into three sections: User Agents, Character Strings, and IP Addresses.

User Agents

User-agents come and go, and are easily spoofed, but it’s worth a few lines of htaccess to block the more persistent bots that repeatedly scan your site with malicious requests.

# Nov 2010 User Agents
SetEnvIfNoCase User-Agent "MaMa " keep_out
SetEnvIfNoCase User-Agent "choppy" keep_out
SetEnvIfNoCase User-Agent "heritrix" keep_out
SetEnvIfNoCase User-Agent […] Continue reading »
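Note that SetEnvIfNoCase only flags matching requests; to actually block them, the list would be paired with a deny block along these lines (a standard Apache 2.2-style companion, assumed here rather than shown in the excerpt):

# deny all requests flagged as keep_out
<Limit GET POST HEAD>
 Order Allow,Deny
 Allow from all
 Deny from env=keep_out
</Limit>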
The 2010 User-Agent Blacklist blocks hundreds of bad bots while ensuring open access for the major search engines: Google, Bing, Ask, Yahoo, et al. Blocking bad user-agents is an effective addition to any security strategy. It works like this: your site is getting hammered by rogue bots that waste valuable server resources and bandwidth. So you grab a copy of the 2010 UA Blacklist from Perishable Press, include it in your site’s root .htaccess file, and enjoy better security and performance. […] Continue reading »