Building the 3G Blacklist, Part 4: Improving the RedirectMatch Directives of the Original 2G Blacklist

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this fourth article, I build upon previous ideas and techniques by improving the directives contained in the original 2G Blacklist. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. In the first version (2G) of the next-generation blacklist, a collection of malicious attack strings […] Read more »
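
To illustrate the general idea, a RedirectMatch-based rule blocks any request whose URL matches a known attack pattern, regardless of user agent or IP address. A minimal sketch, using placeholder patterns rather than the actual 2G/3G rules:

    # Illustrative placeholders only, not the actual 2G/3G rules
    <IfModule mod_alias.c>
     RedirectMatch 403 /_vti_bin/
     RedirectMatch 403 /etc/passwd
     RedirectMatch 403 \.(dll|exe)$
    </IfModule>

Each directive returns a 403 Forbidden response whenever the requested URL matches the given regular expression.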

Building the 3G Blacklist, Part 3: Improving Site Security by Selectively Blocking Rogue User Agents

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this third article, I discuss targeted user-agent blacklisting and present an alternate approach to preventing site access for the most prevalent and malicious user agents. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Several months ago, while developing improved methods for protecting websites […] Read more »
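
For context, a common way to implement selective user-agent blocking in htaccess is mod_setenvif: flag matching agents with an environment variable, then deny flagged requests. A minimal sketch, assuming mod_setenvif is available and using placeholder agent strings rather than the article's actual list:

    # Placeholder user-agent strings, not the article's actual list
    SetEnvIfNoCase User-Agent "libwww-perl" keep_out
    SetEnvIfNoCase User-Agent "HTTrack" keep_out
    <Limit GET POST>
     Order Allow,Deny
     Allow from all
     Deny from env=keep_out
    </Limit>

Any request whose User-Agent header matches one of the patterns sets keep_out and receives a 403 Forbidden response.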

Building the 3G Blacklist, Part 2: Improving Site Security by Preventing Malicious Query-String Exploits

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this second article, I present an incredibly powerful method for eliminating malicious query-string exploits. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. The vast majority of website attacks involve appending malicious query strings onto legitimate, indexed URLs. Any webmaster serious about site […] Read more »
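
As a rough illustration of the technique, mod_rewrite can inspect the query string and refuse any request containing suspicious strings. A minimal sketch, with placeholder patterns rather than the article's actual rules:

    # Placeholder patterns, not the article's actual rules
    <IfModule mod_rewrite.c>
     RewriteEngine On
     RewriteCond %{QUERY_STRING} (\.\./|etc/passwd|boot\.ini) [NC,OR]
     RewriteCond %{QUERY_STRING} (base64_encode|(%3C|<)script) [NC]
     RewriteRule .* - [F,L]
    </IfModule>

The [F] flag returns 403 Forbidden for any request whose query string matches either condition.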

Building the 3G Blacklist, Part 1: Improving Site Security by Recognizing and Exploiting Server Attack Patterns

In this series of five articles, I share insights and discoveries concerning website security and protecting against malicious attacks. In this first article of the series, I examine the process of identifying attack trends and using them to immunize against future attacks. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Crackers, spammers, scrapers, and other attackers are getting smarter and more […] Read more »
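
As a simple example of turning an observed pattern into protection: if the access logs show repeated probes for a script that does not exist on the site, a single directive can shut the whole pattern down. The path below is a placeholder, not a pattern taken from the article:

    # Placeholder path: block a recurring exploit probe spotted in the logs
    RedirectMatch 403 /some/exploit-probe\.php

Every future request matching that pattern is refused with a 403, no matter which IP address or user agent delivers it.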

Universal www-Canonicalization via htaccess

During my previous rendezvous with comprehensive canonicalization for WordPress, I offered my personally customized technique for ensuring consistently precise and accurate URL delivery. That particular method targets WordPress exclusively (although the logic could be adapted for general use), and requires a bit of editing to fit the code to each particular configuration. In this follow-up tutorial, I present a basic www-canonicalization technique that accomplishes the following: requires or removes the www prefix for all URLs; absolutely no editing when requiring the www prefix; minimal editing when removing the www prefix; minimal code used to execute either […] Read more »
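
For reference, the standard "no editing required" way to enforce the www prefix is to rewrite against whatever host the visitor actually requested. A minimal sketch, assuming mod_rewrite is available (the article's own code may differ):

    <IfModule mod_rewrite.c>
     RewriteEngine On
     RewriteCond %{HTTP_HOST} !^www\. [NC]
     RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
    </IfModule>

Because the rule reuses %{HTTP_HOST}, it works on any domain without modification; going the other direction generally means naming the domain in the rule, which is presumably why the article notes a minimal amount of editing when removing the www prefix.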

Blacklist Candidate Number 2008-04-27

Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating and banishing spammers, crackers and other worthless scumbags. Since the implementation of my 2G Blacklist, I have enjoyed a significant decrease in the overall number and variety of site attacks. In fact, I had to time-travel back to March 1st just to find a candidate worthy of this month’s blacklist spotlight. I felt like Rod Roddy looking over the Price-is-Right audience to announce the next name, only to discover a quiet, empty room. And then like Bob gets pissed that nobody […] Read more »

How to Block Proxy Servers via htaccess

Not too long ago, a reader going by the name of bjarbj78 asked about how to block proxy servers from accessing her website. Apparently, bjarbj78 had taken the time to compile a proxy blacklist of over 9,000 domains, only to discover afterwards that the formulated htaccess blacklisting strategy didn’t work as expected:

    deny from proxydomain.com proxydomain2.com

Blacklisting proxy servers by blocking individual domains seems like a futile exercise. Although there are a good number of reliable, consistent proxy domains that could be blocked directly, the vast majority of such sites are constantly changing. It would take a team of professionals […] Read more »
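
One alternative approach (a sketch, not necessarily the article's solution) is to test for the request headers that most proxy servers add, rather than chasing individual proxy domains:

    # Flag requests carrying typical proxy headers; note that this may also
    # catch legitimate visitors behind corporate or ISP proxies
    <IfModule mod_rewrite.c>
     RewriteEngine On
     RewriteCond %{HTTP:Via} !^$ [OR]
     RewriteCond %{HTTP:X-Forwarded-For} !^$ [OR]
     RewriteCond %{HTTP:Forwarded} !^$
     RewriteRule .* - [F,L]
    </IfModule>

The trade-off is noted in the comment: some legitimate users sit behind transparent proxies that add these headers, so this technique blocks more broadly than a domain list would.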

Three Unsolved WordPress Mysteries

After several years of using WordPress, I have at least three unanswered questions: What’s up with the WordPress PHP Memory Error? Why do certain phrases trigger “Forbidden” errors when saving or publishing posts? What happened to the Plugin Pages in the WordPress Codex? Let’s have a look at each one of these baffling mysteries. Read more »

Content Negotiation for XHTML Documents via PHP and htaccess

In this article, I discuss the different MIME types available for XHTML and explain a method for serving your documents with the optimal MIME type, depending on the capabilities of the user agent. Using either htaccess or PHP for content negotiation, we can serve complete, standards-compliant markup for our document’s header information. This is especially helpful when dealing with Internet Explorer while serving a DOCTYPE of XHTML 1.1 along with the recommended XML declaration. According to the RFC standards produced by the IETF, web documents formatted as XHTML may be served as any of the following three MIME types: Read more »
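
An htaccess-only sketch of that negotiation, assuming mod_rewrite (the article may use a different or more complete test), examines the Accept header and adjusts the served MIME type accordingly:

    # Serve .html files as application/xhtml+xml only to agents that accept it;
    # a fuller test would also honor explicit q=0 values in the Accept header
    <IfModule mod_rewrite.c>
     RewriteEngine On
     RewriteCond %{HTTP_ACCEPT} application/xhtml\+xml
     RewriteRule \.html$ - [T=application/xhtml+xml]
    </IfModule>

Agents that do not advertise application/xhtml+xml (notably Internet Explorer of that era) continue to receive the default text/html type.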

Redirect WordPress Feeds to Feedburner via htaccess (Redux)

In a previous article, I explain how to redirect your WordPress feeds to Feedburner. Redirecting your WordPress feeds to Feedburner enables you to take advantage of their many freely provided, highly useful tracking and statistical services. Although there are a few important things to consider before optimizing your feeds and switching to Feedburner, many WordPress users redirect their blog’s two main feeds — “main content” and “all comments” — using either a plugin or directly via htaccess. Here is the htaccess code as previously provided here at Perishable Press: Read more »
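
A typical WordPress-to-FeedBurner redirect of this kind looks roughly like the following sketch; the feed URL is a placeholder, and the article's actual directives may differ:

    <IfModule mod_rewrite.c>
     RewriteEngine On
     # Let FeedBurner itself fetch the original feed to avoid a redirect loop
     RewriteCond %{HTTP_USER_AGENT} !FeedBurner [NC]
     RewriteRule ^feed/?$ http://feeds.feedburner.com/your-feed-name [R=302,L]
     # A similar condition/rule pair would handle the comments feed
    </IfModule>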

Custom HTTP Errors via htaccess

We all know how important it is to deliver sensible, helpful 404 error pages to our visitors. There are many ways of achieving this functionality, including the well-known htaccess trick used to locally redirect users to custom error pages:

    # htaccess custom error pages
    ErrorDocument 400 /errors/400.html
    ErrorDocument 401 /errors/401.html
    ErrorDocument 403 /errors/403.html
    ErrorDocument 404 /errors/404.html
    ErrorDocument 500 /errors/500.html

…and so on. These directives basically tell Apache to deliver the designated documents for their associated error types. Many webmasters and developers employ this trick to ensure that visitors receive customized error pages that are generally more user-friendly or design-specific than […] Read more »

Blacklist Candidate Number 2008-03-09

Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating and banishing spammers, crackers and other worthless scumbags. Imagine, if you will, an overly caffeinated Bob Barker, hunched over his favorite laptop, feverishly scanning his server access files. Like some underpaid factory worker pruning defective bobble heads from a Taiwanese assembly line, Bob rapidly identifies and isolates suspicious log entries with laser focus. Upon further investigation, affirmed spammers, scrapers and crackers are swiftly blacklisted from future access. For the most heinous offenders, we suddenly hear Rod Roddy’s guzzling voice echo throughout […] Read more »

2G Blacklist: Closing the Door on Malicious Attacks

Since posting the Ultimate htaccess Blacklist and then the Ultimate htaccess Blacklist 2, I find myself dealing with a new breed of malicious attacks. It is no longer useful to simply block nefarious user agents because they are frequently faked. Likewise, blocking individual IP addresses is generally a waste of time because the attacks are coming from a decentralized network of zombie machines. Watching my error and access logs very closely, I have observed the following trends in current attacks: user agents are faked, typically using something generic like “Mozilla/5.0”; each attack may involve hundreds of compromised IP addresses; attacks […] Read more »
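
In practical terms, the shift described above means matching the request itself rather than the requester. A placeholder illustration of that string-matching approach:

    # Placeholder patterns: refuse requests for obviously non-public resources
    <IfModule mod_alias.c>
     RedirectMatch 403 \.(sql|bak|old)$
    </IfModule>

A rule like this holds up even when the user agent is a generic "Mozilla/5.0" and the IP address changes with every request.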

Over 150 of the Worst Spammers, Scrapers and Crackers from 2007

Update 2010/07/07: Please visit the 2010 IP Blacklist for more current information. Over the course of each year, I blacklist a considerable number of individual IP addresses. Every day, Perishable Press is hit with countless numbers of spammers, scrapers, crackers and all sorts of other hapless turds. Weekly examinations of my site’s error logs enable me to filter through the chaff and cherry-pick only the most heinous, nefarious attackers for blacklisting. Minor offenses are generally dismissed, but the evil bastards that insist on wasting resources running redundant automated scripts are immediately investigated via IP lookup and denied access via simple […] Read more »
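
For reference, blocking individual IP addresses from htaccess is a one-line affair per offender. The addresses below are documentation placeholders, not entries from the actual 2007 list:

    <Limit GET POST PUT>
     Order Allow,Deny
     Allow from all
     Deny from 192.0.2.1
     Deny from 198.51.100
    </Limit>

A partial address (as in the second entry) denies the entire range, which is handy when an attacker rotates through addresses on the same network.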

Improve Site Performance by Increasing PHP Memory for WordPress

During the recent ASO server debacle, I raced frantically to restore functionality to Perishable Press. Along the way, one of the many tricks I attempted while trying to fix the dreaded “white screen of death” syndrome involved increasing the amount of PHP memory available to WordPress. This fix worked for me, but may not prove effective on every installation of WordPress. If you are unsure as to whether or not you need to increase your PHP memory, consult with your host concerning currently available memory and overall compatibility with a localized increase. Note that if your blog is running […] Read more »
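
One way to apply such an increase, assuming PHP runs as an Apache module and the host permits per-directory overrides, is a php_value directive in htaccess; the 64M figure is only an example, and the article's own method may differ:

    <IfModule mod_php5.c>
     php_value memory_limit 64M
    </IfModule>

If PHP runs as CGI/FastCGI instead, this directive has no effect and the change belongs in php.ini or the application's configuration.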

WordPress Error Fix(?): Increase PHP Memory for cache.php

This trick isn’t guaranteed to prevent all WordPress-generated PHP memory errors, but it certainly seems to help reduce their overall occurrence. For some reason, after my host upgraded their servers to Apache 1.3.41, I began logging an extremely high number of fatal PHP “memory exhausted” errors resulting from the WordPress cache.php script. Here is an example of the countless errors that are generated: Read more »
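
If the exhaustion is confined to a single script, the same idea can be scoped to just that file from htaccess. This is an illustrative variation, not necessarily the fix described in the article, which may work inside cache.php itself:

    <IfModule mod_php5.c>
     <Files cache.php>
      php_value memory_limit 64M
     </Files>
    </IfModule>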
