Just a quick WordPress snippet for future reference. I recently explained how to disable comments, pingbacks, and trackbacks via SQL. Here’s a good way to do it via PHP:

<?php
// close comments and pings once a single post is older than 30 days
function close_comments( $posts ) {
	if ( !is_single() ) {
		return $posts;
	}
	if ( time() - strtotime( $posts[0]->post_date_gmt ) > ( 30 * 24 * 60 * 60 ) ) {
		$posts[0]->comment_status = 'closed';
		$posts[0]->ping_status    = 'closed';
	}
	return $posts;
}
add_filter( 'the_posts', 'close_comments' );
?>

You can run […] Continue reading »
As you may observe, the WordPress installation that powers Perishable Press is located in a subdirectory named press. This configuration was intentional, as I wanted the option to easily install and maintain multiple versions of WordPress in variously named subdirectories. As much as I enjoy this flexibility, many would argue for the SEO-related benefits of installing WordPress in your site’s root directory, or at least making it appear that way by using WordPress’ easily customizable “Blog Address” options setting. Continue reading »
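For reference, the usual way to make a subdirectory install answer at the root (as described in the WordPress documentation) is to point the Blog Address at the root URL, copy index.php from the subdirectory into the root, and adjust its require path. A minimal sketch, assuming the install lives in /press:

<?php
// index.php placed in the site root (sketch; assumes WordPress lives in /press)
// Tell WordPress to load the theme and output content.
define( 'WP_USE_THEMES', true );

// Load the WordPress environment from the subdirectory install.
require( dirname( __FILE__ ) . '/press/wp-blog-header.php' );
?>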
For the past several months and up until just recently, Perishable Press had been suffering from unpredictable episodes of the dreaded white screen of death. Although blank white screens happen to virtually all WordPress users now and then, certain configurations seem to trigger crashes more frequently than others. Here, I am referring to WordPress version 2.3. In this case, the unpredictable crashes, inconsistent errors, and general instability began several months ago after I had completed my WordPress theme restoration project. […] Continue reading »
Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating, and banishing spammers, crackers, and other worthless scumbags. Just under the wire! Even so, this month’s official Blacklist Candidate article may be the last monthly installment of the series. Although additional BC articles may appear in the future, it is unlikely that they will continue as a regular monthly feature. Oh sure, I see the tears streaming down your face, but think […] Continue reading »
In the now-complete series, Building the 3G Blacklist, I share insights and discoveries concerning website security and protection against malicious attacks. Each article in the series focuses on unique blacklist strategies designed to protect sites transparently, effectively, and efficiently. The five articles culminate in the release of the next generation 3G Blacklist. Here is a quick summary of the entire Building the 3G Blacklist series: Continue reading »
As you know, HTAccess files are powerful tools for manipulating site performance and functionality. Protecting your site’s HTAccess files is critical to maintaining a secure environment. Fortunately, preventing access to your HTAccess files is very easy. Let’s have a look. Continue reading »
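One common way to lock these files down (not necessarily the article’s exact directive) is a FilesMatch block that denies all web requests for anything beginning with .ht:

# deny all web access to .htaccess and .htpasswd files (Apache 2.2-style syntax)
<FilesMatch "^\.ht">
 Order Allow,Deny
 Deny from all
</FilesMatch>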
After much research and discussion, I have developed a concise, lightweight security strategy for Apache-powered websites. Prior to the development of this strategy, I relied on several extensive blacklists to protect my sites against malicious user agents and IP addresses. Over time, these mega-lists became unmanageable and ineffective. As increasing numbers of attacks hit my server, I began developing new techniques for defending against external threats. This work soon culminated in the release of a “next-generation” blacklist that works by […] Continue reading »
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. Wrapping up the series with this article, I provide the final key to our comprehensive blacklist strategy: selectively blocking individual IPs. Previous articles focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. In the next article, the series will culminate in the release of the next generation 3G Blacklist. Continue reading »
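As a general illustration of the technique (placeholder addresses from the documentation ranges, not the article’s actual list), selectively blocking individual IPs in HTAccess looks something like this:

# deny requests from individual problem IPs (placeholder addresses)
<Limit GET POST PUT>
 Order Allow,Deny
 Allow from all
 Deny from 192.0.2.1
 Deny from 203.0.113.45
</Limit>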
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this fourth article, I build upon previous ideas and techniques by improving the directives contained in the original 2G Blacklist. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Continue reading »
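For a sense of what such directives look like (illustrative patterns only, not the article’s actual ruleset), the blacklist style in question forbids requests whose URLs contain telltale character strings:

# forbid requests containing suspicious character strings (illustrative patterns only)
RedirectMatch 403 \.\./
RedirectMatch 403 /etc/passwd
RedirectMatch 403 base64_encode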
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this third article, I discuss targeted, user-agent blacklisting and present an alternate approach to preventing site access for the most prevalent and malicious user agents. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G […] Continue reading »
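As a rough sketch of user-agent blacklisting in HTAccess (made-up agent names, not the article’s list), SetEnvIfNoCase can flag offending agents and a Limit block can deny them:

# block requests from blacklisted user agents (example agent names only)
SetEnvIfNoCase User-Agent "EvilScraper" keep_out
SetEnvIfNoCase User-Agent "SpamHarvest" keep_out
<Limit GET POST>
 Order Allow,Deny
 Allow from all
 Deny from env=keep_out
</Limit>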
In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this second article, I present an incredibly powerful method for eliminating malicious query string exploits. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist.

Improving Security by Preventing Query String Exploits

A vast […] Continue reading »
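To illustrate the general approach (example patterns, not the article’s final ruleset), mod_rewrite can inspect the query string and return 403 Forbidden for requests containing known exploit fragments:

# forbid requests whose query strings contain common exploit fragments (illustrative)
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{QUERY_STRING} \.\./         [NC,OR]
 RewriteCond %{QUERY_STRING} boot\.ini     [NC,OR]
 RewriteCond %{QUERY_STRING} base64_encode [NC]
 RewriteRule .* - [F,L]
</IfModule>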
In this series of five articles, I share insights and discoveries concerning website security and protecting against malicious attacks. In this first article of the series, I examine the process of identifying attack trends and using them to immunize against future attacks. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving […] Continue reading »
Welcome to the Perishable Press “Blacklist Candidate” series. In this post, we continue our new tradition of exposing, humiliating, and banishing spammers, crackers, and other worthless scumbags. Since the implementation of my 2G Blacklist, I have enjoyed a significant decrease in the overall number and variety of site attacks. In fact, I had to time-travel back to March 1st just to find a candidate worthy of this month’s blacklist spotlight. I felt like Rod Roddy looking over the Price-is-Right audience […] Continue reading »
Not too long ago, a reader going by the name of bjarbj78 asked about how to block proxy servers from accessing her website. Apparently, bjarbj78 had taken the time to compile a proxy blacklist of over 9,000 domains, only to discover afterwards that the formulated htaccess blacklisting strategy didn’t work as expected. Here is the ineffective htaccess directive that was used:

Deny from proxydomain.com proxydomain2.com

Blacklisting proxy servers by blocking individual domains seems like a futile exercise. Although there are […] Continue reading »
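One commonly suggested alternative (not necessarily the solution the article arrives at) is to ignore domains altogether and deny requests that carry the headers most proxies add, with the caveat that this can also block legitimate visitors behind corporate or ISP proxies:

# deny requests that arrive with typical proxy headers (may also affect legitimate proxy users)
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTP:VIA}             !^$ [OR]
 RewriteCond %{HTTP:FORWARDED}       !^$ [OR]
 RewriteCond %{HTTP:X-FORWARDED-FOR} !^$
 RewriteRule .* - [F,L]
</IfModule>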
Recently, while restoring my collection of Perishable Press themes, I needed a fast, effective way to randomize a series of images using PHP. After playing around with several possibilities, I devised the following drop-dead easy technique: Continue reading »
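The article’s exact technique is behind the link, but a minimal PHP sketch of the same idea (hypothetical file names) might look like this:

<?php
// choose one image at random from a list (hypothetical file names)
$images = array( 'header-01.jpg', 'header-02.jpg', 'header-03.jpg' );
$random = $images[ array_rand( $images ) ];
echo '<img src="/images/' . $random . '" alt="" />';
?>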
We all know how important it is to deliver sensible, helpful 404 error pages to our visitors. There are many ways of achieving this functionality, including the well-known htaccess trick used to locally redirect users to custom error pages:

# htaccess custom error pages
ErrorDocument 400 /errors/400.html
ErrorDocument 401 /errors/401.html
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html

…and so on. These directives basically tell Apache to deliver the designated documents for their associated error types. Many webmasters and […] Continue reading »