
Case-Insensitive RedirectMatch

Cool trick that you may not have known about.. it’s possible to get case-insensitive matching with the powerful RedirectMatch directive. Normally, you would just write your redirect as something like this: Read more »
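The trick, in a nutshell: RedirectMatch patterns are regular expressions, so the (?i) inline flag makes the match case-insensitive. A minimal sketch (example.com and the page paths are placeholders):

    # case-sensitive by default: matches /old-page/ but not /Old-Page/
    RedirectMatch 301 ^/old-page/ http://example.com/new-page/

    # case-insensitive via the (?i) flag: also matches /Old-Page/, /OLD-PAGE/, etc.
    RedirectMatch 301 (?i)^/old-page/ http://example.com/new-page/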

WordPress Add-on for 5G Blacklist

Ill requests and malicious scans have been spiking recently, to the point where server performance was really taking a hit. One scan in particular hammered the server with thousands of bad requests in just a few minutes. There are people out there with strong scripts and small minds that are constantly scanning sites for vulnerabilities, and much of what I’ve seen is aimed primarily at WordPress. Read more »

Redirect WordPress Date Archives with .htaccess

Restructuring a WordPress website may involve removing a subdirectory from URLs/permalinks. For example, I recently removed the original WP-install subdirectory from Perishable Press to simplify site structure and optimize WordPress permalinks. There are PHP scripts and WP plugins that might work for this, but in most cases .htaccess is optimal when changing URL structure and redirecting traffic. Here’s a quick example to help visualize the concept: Read more »
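As a minimal sketch of that concept (my own illustration; /press/ and example.com are placeholders), this 301-redirects everything from a removed WP-install subdirectory to the site root:

    # hypothetical: site moved from the /press/ subdirectory up to the root
    RedirectMatch 301 ^/press/(.*)$ http://example.com/$1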

5G Blacklist 2012

Update: Check out the new and improved 5G Blacklist 2013! The 5G Blacklist helps reduce the number of malicious URL requests that hit your website. It’s one of many ways to improve the security of your site and protect against evil exploits, bad requests, and other nefarious garbage. After extensive beta testing, the 5G Blacklist/Firewall is solid and ready to help secure sites hosted on Apache servers. In addition to beta testing for the 5G, this is the 5th major update of my “G”-series blacklists. Here is a quick overview of its evolution: Read more »

Building the 5G Blacklist

Protecting your website is more important than ever. There are a million ways to do it, and this is one of them. In fact, it’s what I use to protect Perishable Press and other key sites. It’s called the 5G Blacklist, and it’s something I’ve been working on for a long time. The idea is simple enough: analyze bad requests and block them using a firewall/blacklist via .htaccess. Now in its 5th generation, the 5G Blacklist has evolved into a considerably solid method of keeping your site safe and secure. How does it work? I’m glad you asked.. Read more »
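To illustrate the general pattern (an example of the concept only, not the actual 5G directives), a blacklist rule denies any request whose query string matches a known-bad fragment:

    # illustration only, not the real 5G rules
    <IfModule mod_rewrite.c>
    RewriteEngine On
    # block query strings containing directory traversal or common exploit strings
    RewriteCond %{QUERY_STRING} (\.\./|boot\.ini|etc/passwd) [NC]
    RewriteRule .* - [F,L]
    </IfModule>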

Optimizing WordPress Permalinks with htaccess

Okay, so summer’s over, kids are back in school, and I’m finding all sorts of free time to continue writing and posting. One of my summer projects involved updating & optimizing one of my old project sites, DeadLetterArt.com. It was basically a huge clean-up session that included lots of content consolidation and permalink restructuring. So that’s the topic of this post: how to use htaccess to optimize WordPress permalinks. I’ll go through some htaccess techniques and explain how they can improve your WordPress-powered site. Read more »
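As a taste of the sort of technique involved (a hedged sketch, not necessarily the exact rules from the post; example.com is a placeholder), old date-based permalinks can be 301-redirected to a new /%postname%/ structure:

    # hypothetical: redirect /YYYY/MM/DD/post-name/ permalinks to /post-name/
    RedirectMatch 301 ^/[0-9]{4}/[0-9]{2}/[0-9]{2}/([^/]+)/?$ http://example.com/$1/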

Huge Collection of Code Snippets: HTAccess, PHP, WordPress, jQuery, HTML, CSS

Please excuse this self-serving, miscellaneous post, but I’ve just got to purge all of these code snippets and scraps collected over the years. Whenever I update this site, I place any removed/unused code snippets into a giant note file for future reference, just in case. There are all sorts of different types of code and snippets that just keep growing and growing.. and finally it gets to a point where I just need to dump everything and start fresh. That is the purpose of this post. Read more »

Block Tough Proxies

If you want to block tough proxies like hidemyass.com, my previously posted .htaccess methods won’t work. Those methods will block a good number of proxy visits to your site, but won’t work on the stealthier proxies. Fortunately, we can use a bit of PHP to keep them out. Read more »

Upload Large Files or Die Trying

I recently spent some time wrestling with various e-commerce/shopping-cart/membership plugins. One of them was of course the popular WP e-Commerce plugin, which uses a directory named “downloadables” to store your precious goods. I had some large files that needed to go into this folder, but the server’s upload limit stopped me from using the plugin’s built-in file uploader to do so. Read more »
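One common workaround, assuming Apache is running PHP as a module and your host allows the override (the values here are arbitrary examples), is to raise PHP’s limits via .htaccess:

    # raise PHP upload limits (requires mod_php; triggers a 500 error under CGI/FastCGI)
    php_value upload_max_filesize 100M
    php_value post_max_size 100M
    php_value max_execution_time 300
    php_value max_input_time 300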

Humans.txt

One thing I love about Twitter is the instant feedback. For the past few weeks I’ve been seeing lots of 404 requests like this:

    http://perishablepress.com/humans.txt
    http://perishablepress.com/humans.txt
    http://perishablepress.com/humans.txt

At first I thought it was some skript kiddie getting creative, you know, as a play on the robots.txt file, which is also located in the root of many websites. So it seemed interesting enough to tweet about: Read more »

Clean Up Malicious Links with HTAccess

I recently spent some time analyzing Perishable Press pages as they appear in the search results for Google, Bing, et al. Google Webmaster Tools provides a wealth of information about crawl errors, as well as the URLs of any pages that link to missing content. Combined with your site’s access/error logs, you have everything needed to track down 404 errors and clean up your listings in the search engine results. Read more »
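Once a bad inbound link is identified, the cleanup can be as simple as a one-line 301 (a hypothetical example; the URLs are placeholders):

    # send mangled inbound links for a real permalink to the correct page
    RedirectMatch 301 ^/my-post/.+$ http://example.com/my-post/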

5G Firewall Beta

Update: Check out the new and improved 5G Blacklist 2013! (The beta version provided in this post is now for reference only.) An update to the 4G Blacklist, the new 5G Firewall is now open for beta testing. The new code is better than ever, providing wider protection with less code and fewer false positives. I’ve had much success with this new firewall, but more testing is needed to ensure maximum compatibility and minimal issues. Read more »

Canonical URLs and Subdomains with Plesk

I am in the process of migrating my sites from A Small Orange to Media Temple. Part of that process involves canonicalizing domain URLs to help maximize SEO strategy. At ASO, URL canonicalization required just a few htaccess directives:

    # enforce no www prefix
    <IfModule mod_rewrite.c>
    RewriteCond %{HTTP_HOST} !^domain\.tld$ [NC]
    RewriteRule ^(.*)$ http://domain.tld/$1 [R=301,L]
    </IfModule>

When placed in the web-accessible root directory’s htaccess file, that snippet will ensure that all requests for your site are not prefixed with www. There’s also a force-www technique if that’s how you roll. Either way, the point is that on most shared hosting, URL […] Read more »
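For reference, a sketch of the force-www counterpart mentioned above (same placeholder domain.tld):

    # enforce www prefix
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.domain\.tld$ [NC]
    RewriteRule ^(.*)$ http://www.domain.tld/$1 [R=301,L]
    </IfModule>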

Latest Blacklist Entries

Recently cleared several megabytes of log files, detecting patterns, recording anomalies, and blacklisting gross offenders. Gonna break it down into three sections: User Agents, Character Strings, and IP Addresses.

User Agents

User-agents come and go, and are easily spoofed, but it’s worth a few lines of htaccess to block the more persistent bots that repeatedly scan your site with malicious requests.

    # Nov 2010 User Agents
    SetEnvIfNoCase User-Agent "MaMa " keep_out
    SetEnvIfNoCase User-Agent "choppy" keep_out
    SetEnvIfNoCase User-Agent "heritrix" keep_out
    SetEnvIfNoCase User-Agent "Purebot" keep_out
    SetEnvIfNoCase User-Agent "PostRank" keep_out
    SetEnvIfNoCase User-Agent "archive.org_bot" keep_out
    SetEnvIfNoCase User-Agent "msnbot.htm)._" keep_out
    <Limit GET POST PUT>
    Order Allow,Deny […]

Read more »

How to Deal with Content Scrapers

Chris Coyier of CSS-Tricks recently declared that people should do “nothing” in response to other sites scraping their content. I totally get what Chris is saying here. He is basically saying that the original source of content is better than scrapers because:

- it’s on a domain with more trust.
- you published that article first.
- it’s coded better for SEO than theirs.
- it’s better designed than theirs.
- it isn’t at risk for serious penalization from search engines.

If these things are all true, then I agree, you have nothing to worry about. Unfortunately, that’s a tall order for many sites on […] Read more »

2010 User-Agent Blacklist

Update: Check out the new and improved 2013 User Agent Blacklist! The 2010 User-Agent Blacklist blocks hundreds of bad bots while ensuring open access for the major search engines: Google, Bing, Ask, Yahoo, et al. Blocking bad user-agents is an effective addition to any security strategy. It works like this: your site is getting hammered by rogue bots that waste valuable server resources and bandwidth. So you grab a copy of the 2010 UA Blacklist from Perishable Press, include it in your site’s root .htaccess file, and enjoy a more secure and better performing website. It’s that easy.

Proven Security

The […] Read more »
