In the now-complete series, Building the 3G Blacklist, I share insights and discoveries concerning website security and protection against malicious attacks. Each article in the series focuses on unique blacklist strategies designed to protect sites transparently, effectively, and efficiently. The five articles culminate in the release of the next generation 3G Blacklist. Here is a quick summary of the entire Building the 3G Blacklist series: Continue reading »
While developing the 3G Blacklist, I completely renovated the Perishable Press site-root and blog-root HTAccess files. Since the makeover, I have enjoyed better performance, fewer errors, and cleaner code. In this article, I share some of the changes made to the root HTAccess file and provide a brief explanation as to their intended purpose and potential benefit. In sharing this information, I hope to inspire others to improve their own HTAccess and/or configuration files. In the next article, I will […] Continue reading »
As you know, HTAccess files are powerful tools for manipulating site performance and functionality. Protecting your site’s HTAccess files is critical to maintaining a secure environment. Fortunately, preventing access to your HTAccess files is very easy. Let’s have a look… Continue reading »
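For reference, one widely used pattern denies web access to any file whose name begins with “.ht” (a minimal sketch in Apache 2.2-style syntax, not necessarily the exact directives from the article):

# deny web access to .htaccess, .htpasswd, and similar files
# Apache 2.2-style syntax; Apache 2.4 uses "Require all denied" instead
<Files ~ "^\.ht">
 Order Allow,Deny
 Deny from all
</Files>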
After much research and discussion, I have developed a concise, lightweight security strategy for Apache-powered websites. Prior to the development of this strategy, I relied on several extensive blacklists to protect my sites against malicious user agents and IP addresses. Over time, these mega-lists became unmanageable and ineffective. As increasing numbers of attacks hit my server, I began developing new techniques for defending against external threats. This work soon culminated in the release of a “next-generation” blacklist that works by […] Continue reading »
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. Wrapping up the series with this article, I provide the final key to our comprehensive blacklist strategy: selectively blocking individual IPs. Previous articles focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. In the next article, the series will culminate in the release of the next generation 3G Blacklist. Continue reading »
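As a rough illustration of the idea (not the 3G rules themselves), a single troublesome IP can be denied with a few lines of Apache 2.2-style configuration; the address shown is a placeholder from a documentation range:

# deny all requests from one problem IP address (placeholder shown)
<Limit GET POST PUT>
 Order Allow,Deny
 Allow from all
 Deny from 203.0.113.15
</Limit>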
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this fourth article, I build upon previous ideas and techniques by improving the directives contained in the original 2G Blacklist. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Continue reading »
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this third article, I discuss targeted, user-agent blacklisting and present an alternate approach to preventing site access for the most prevalent and malicious user agents. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G […] Continue reading »
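To give a sense of the general technique (the article develops its own, more refined approach), a user agent can be flagged with SetEnvIfNoCase and then denied; the agent strings below are illustrative examples only:

# flag known-bad user agents and deny their requests (example strings only)
SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
SetEnvIfNoCase User-Agent "EmailSiphon" bad_bot
<Limit GET POST>
 Order Allow,Deny
 Allow from all
 Deny from env=bad_bot
</Limit>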
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this second article, I present an incredibly powerful method for eliminating malicious query string exploits. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist.

Improving Security by Preventing Query String Exploits

A vast […] Continue reading »
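As a minimal sketch of the general idea (illustrative patterns only, not the article’s actual ruleset), mod_rewrite can inspect the query string and refuse any request that matches a known-bad pattern:

# deny requests whose query string contains obviously malicious patterns
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} \.\./ [OR]
 RewriteCond %{QUERY_STRING} boot\.ini [NC,OR]
 RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC]
 RewriteRule .* - [F,L]
</IfModule>

Requests matching any of the conditions receive a 403 Forbidden response.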
In this series of five articles, I share insights and discoveries concerning website security and protecting against malicious attacks. In this first article of the series, I examine the process of identifying attack trends and using them to immunize against future attacks. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving […] Continue reading »
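By way of illustration (the path below is hypothetical, not taken from the article), once a recurring exploit string turns up in the access or error logs, a single directive can deny every request that contains it:

# deny any request whose URL contains a fragment seen repeatedly in attack logs
RedirectMatch 403 /_vti_bin/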
In my previous rendezvous with comprehensive canonicalization for WordPress, I offered my personally customized technique for ensuring consistently precise and accurate URL delivery. That particular method targets WordPress exclusively (although the logic could be adapted for general use), and requires a bit of editing to fit the code to each particular configuration. In this follow-up tutorial, I present a basic www-canonicalization technique that accomplishes the following: Continue reading »
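As a bare-bones sketch of www-canonicalization in general (example.com is a placeholder; the article’s version is generalized further), the idea looks roughly like this:

# redirect any non-www request to the www version of the domain
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
 RewriteRule (.*) http://www.example.com/$1 [R=301,L]
</IfModule>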
Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating and banishing spammers, crackers and other worthless scumbags… Since the implementation of my 2G Blacklist, I have enjoyed a significant decrease in the overall number and variety of site attacks. In fact, I had to time-travel back to March 1st just to find a candidate worthy of this month’s blacklist spotlight. I felt like Rod Roddy looking over the Price-is-Right audience […] Continue reading »
Not too long ago, a reader going by the name of bjarbj78 asked how to block proxy servers from accessing her website. Apparently, bjarbj78 had taken the time to compile a proxy blacklist of over 9,000 domains, only to discover afterwards that the formulated htaccess blacklisting strategy didn’t work as expected. Here is the ineffective htaccess directive that was used:

Deny from proxydomain.com proxydomain2.com

Blacklisting proxy servers by blocking individual domains seems like a futile exercise. Although there are […] Continue reading »
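One commonly used alternative checks for proxy-identifying request headers rather than domain names. A rough sketch follows (abbreviated header list, shown only to illustrate the approach; note that it will also catch legitimate visitors behind corporate proxies):

# deny requests that arrive with common proxy-identifying headers
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP:VIA} !^$ [OR]
 RewriteCond %{HTTP:FORWARDED} !^$ [OR]
 RewriteCond %{HTTP:X-FORWARDED-FOR} !^$
 RewriteRule .* - [F,L]
</IfModule>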
We all know how important it is to deliver sensible, helpful 404 error pages to our visitors. There are many ways of achieving this functionality, including the well-known htaccess trick used to locally redirect users to custom error pages:

# htaccess custom error pages
ErrorDocument 400 /errors/400.html
ErrorDocument 401 /errors/401.html
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html

..and so on. These directives basically tell Apache to deliver the designated documents for their associated error types. Many webmasters and […] Continue reading »
After upgrading WordPress from version 2.0.5 to 2.3.3, I did some experimenting with the “post autosave” feature. The autosave feature uses some crafty ajax to automagically save your post every two minutes (120 seconds) by default. Below the post-editing field, you will notice a line of text that displays the time of the most recent autosave, similar to the following: Continue reading »
After investigating some unusual 404 errors the other day, I found myself digging through the WordPress Admin Area trying to locate the “Subscribe to Comments” options panel. As it turns out, administrative options for the Subscribe to Comments plugin are split into two different areas. First, the S2C plugin provides configuration options under the WordPress General Settings > “Subscribe to Comments”, which enables users to tweak everything from subscription messages to custom CSS styles. New to me was the other […] Continue reading »
For future reference, this article covers each of the many ways to access your WordPress-generated feeds. Several different URL formats are available for the various types of WordPress feeds — posts, comments, and categories — for both permalink and default URL structures. For each example, replace “http://example.com/” with the URL of your blog. Note: even though your blog’s main feed is accessible through many different URLs, there are clear benefits to using a single, consistent feed URL throughout your site. […] Continue reading »
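For a quick taste, here are a few of the most commonly used formats (permalink-style first, default query-string style second; the full list is in the article):

http://example.com/feed/
http://example.com/comments/feed/
http://example.com/?feed=rss2
http://example.com/?feed=comments-rss2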