After much research and discussion, I have developed a concise, lightweight security strategy for Apache-powered websites. Prior to the development of this strategy, I relied on several extensive blacklists to protect my sites against malicious user agents and IP addresses. Over time, these mega-lists became unmanageable and ineffective. As increasing numbers of attacks hit my server, I began developing new techniques for defending against external threats. This work soon culminated in the release of a “next-generation” blacklist that works by […] Continue reading »
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. Wrapping up the series with this article, I provide the final key to our comprehensive blacklist strategy: selectively blocking individual IPs. The previous articles in this series focus on other key blacklist strategies designed to protect your site transparently, effectively, and efficiently. In the next article, the techniques developed throughout these five articles will culminate in the release of the next generation 3G Blacklist. Continue reading »
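For readers wondering what blocking an individual IP looks like in practice, here is a minimal, generic htaccess sketch (the addresses are placeholders, and the specific directives developed in the article may differ):

# generic sketch: deny a couple of placeholder IP addresses, allow everyone else
Order Allow,Deny
Allow from all
Deny from 192.0.2.1
Deny from 203.0.113.7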
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this fourth article, I build upon previous ideas and techniques by improving the directives contained in the original 2G Blacklist. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Continue reading »
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this third article, I discuss targeted user-agent blacklisting and present an alternate approach to preventing site access for the most prevalent and malicious user agents. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G […] Continue reading »
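As a generic illustration of user-agent blacklisting (not necessarily the alternate approach presented in the article), requests may be denied based on the User-Agent header; the agent names below are placeholders:

# generic sketch: deny requests from placeholder user agents
SetEnvIfNoCase User-Agent "BadBot" keep_out
SetEnvIfNoCase User-Agent "EvilScraper" keep_out
<Limit GET POST>
 Order Allow,Deny
 Allow from all
 Deny from env=keep_out
</Limit>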
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this second article, I present an incredibly powerful method for eliminating malicious query string exploits. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist.

Improving Security by Preventing Query String Exploits

A vast […] Continue reading »
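To give a rough idea of the general concept (the patterns below are illustrative placeholders, not the actual directives presented in the article), query strings may be inspected with mod_rewrite and offending requests denied:

# generic sketch: deny requests containing suspicious query strings
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} \.\./ [OR]
 RewriteCond %{QUERY_STRING} boot\.ini [NC,OR]
 RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC]
 RewriteRule .* - [F,L]
</IfModule>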
In this series of five articles, I share insights and discoveries concerning website security and protecting against malicious attacks. In this first article of the series, I examine the process of identifying attack trends and using them to immunize against future attacks. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving […] Continue reading »
In my previous rendezvous with comprehensive canonicalization for WordPress, I offered my personally customized technique for ensuring consistently precise and accurate URL delivery. That particular method targets WordPress exclusively (although the logic could be adapted for general use), and requires a bit of editing to tailor the code to each particular configuration. In this follow-up tutorial, I present a basic www-canonicalization technique that accomplishes the following: Continue reading »
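For context, the basic idea of www-canonicalization looks something like the following generic mod_rewrite sketch (example.com is a placeholder; the technique presented in the article may differ in its particulars):

# generic sketch: redirect non-www requests to the www hostname
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
 RewriteRule (.*) http://www.example.com/$1 [R=301,L]
</IfModule>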
Recently, while restoring the popular Jupiter! WordPress theme, which several readers use to “skin” the Perishable Press website, I found myself searching for a simple, effective JavaScript technique for toggling element visibility. Specifically, I needed to accomplish the following design goals: Continue reading »
Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating, and banishing spammers, crackers, and other worthless scumbags. Since the implementation of my 2G Blacklist, I have enjoyed a significant decrease in the overall number and variety of site attacks. In fact, I had to time-travel back to March 1st just to find a candidate worthy of this month’s blacklist spotlight. I felt like Rod Roddy looking over the Price-is-Right audience […] Continue reading »
Following my recent post on CSS code formatting, I was delightfully surprised to have received such insightful, enthusiastic feedback. Apparently, I am not the only person passionate about the subtle nuances involved with the formatting of CSS code. So, to continue the conversation, let’s explore several techniques for writing the opening and closing brackets of CSS declaration blocks. Continue reading »
Not too long ago, a reader going by the name of bjarbj78 asked about how to block proxy servers from accessing her website. Apparently, bjarbj78 had taken the time to compile a proxy blacklist of over 9,000 domains, only to discover afterwards that the formulated htaccess blacklisting strategy didn’t work as expected. Here is the ineffective htaccess directive that was used:

Deny from proxydomain.com proxydomain2.com

Blacklisting proxy servers by blocking individual domains seems like a futile exercise. Although there are […] Continue reading »
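One commonly seen alternative (shown here purely as an illustration, not necessarily the strategy ultimately recommended in the article) is to check for typical proxy-related request headers instead of blacklisting thousands of domains:

# illustrative sketch: deny requests that announce themselves as proxied
# note: this is aggressive and may also block legitimate visitors behind corporate proxies
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP:VIA} !^$ [OR]
 RewriteCond %{HTTP:X-FORWARDED-FOR} !^$ [OR]
 RewriteCond %{HTTP:PROXY-CONNECTION} !^$
 RewriteRule .* - [F,L]
</IfModule>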
Recently, while restoring my collection of Perishable Press themes, I needed a fast, effective way to randomize a series of images using PHP. After playing around with several possibilities, I devised the following drop-dead easy technique: Continue reading »
After reading my previous article on preloading images without JavaScript, Nanda pointed out that adding extra markup to preload images is not the best approach, especially where Web Standards are concerned. Mobile devices, for example, may experience problems when dealing with the following preloading technique:

/* ADD THIS TO CSS */
div#preloaded-images {
	position: absolute;
	overflow: hidden;
	left: -9999px;
	top: -9999px;
	height: 1px;
	width: 1px;
	}

<!-- ADD THIS TO XHTML -->
<div id="preloaded-images">
	<img src="https://perishablepress.com/image-01.png" width="1" height="1" alt="Image 01" […] Continue reading »
Call me strange, but I format each of my CSS rules according to the following structure/pattern:

div#example element {
	margin: 5px 15px 5px 0;
	border: 1px solid #444;
	line-height: 1.5em;
	text-align: center;
	background: #222;
	font-size: 10px;
	display: block;
	padding: 5px;
	color: #888;
	float: left;
	}
div#another div.example element {
	border: 1px solid #444;
	margin: 7px 0 17px 0;
	letter-spacing: 1px;
	font-weight: bold;
	background: #222;
	font-size: 1.1em;
	cursor: pointer;
	display: block;
	padding: 3px;
	width: 308px;
	color: #888;
	clear: left;
	float: left;
	[…] Continue reading »
We all know how important it is to deliver sensible, helpful 404 error pages to our visitors. There are many ways of achieving this functionality, including the well-known htaccess trick used to locally redirect users to custom error pages:

# htaccess custom error pages
ErrorDocument 400 /errors/400.html
ErrorDocument 401 /errors/401.html
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html

…and so on. These directives basically tell Apache to deliver the designated documents for their associated error types. Many webmasters and […] Continue reading »
After upgrading WordPress from version 2.0.5 to 2.3.3, I did some experimenting with the “post autosave” feature. The autosave feature uses some crafty ajax to automagically save your post every 2 minutes (120 seconds by default). Below the post-editing field, you will notice a line of text that displays the time of the most recent autosave, similar to the following: Continue reading »