Web Design
Category Archive

Simple IP-Detection Bad for SEO

In general, Perishable Press enjoys generous ranking in Google’s search-engine results. The site’s many pages bring in lots of traffic for some great keywords, and a direct search for “Perishable Press” returns the first spot, even with eight featured sitelinks. And recently, after switching servers, traffic increased even further. Things were going well, and it seemed like the perfect opportunity to finally renovate and redesign the site. So I dive in… And then, approximately 24-48 hours after beginning work on the new design, BAM: suddenly Google cuts my traffic by 75% and removes most of my pages from […] Read more »

Canonical URLs and Subdomains with Plesk

I am in the process of migrating my sites from A Small Orange to Media Temple. Part of that process involves canonicalizing domain URLs to help maximize SEO strategy. At ASO, URL canonicalization required just a few htaccess directives:

	# enforce no www prefix
	<IfModule mod_rewrite.c>
		RewriteCond %{HTTP_HOST} !^domain\.tld$ [NC]
		RewriteRule ^(.*)$ http://domain.tld/$1 [R=301,L]
	</IfModule>

When placed in the web-accessible root directory’s htaccess file, that snippet ensures that all requests for your site are not prefixed with www. There’s also a force-www technique if that’s how you roll. Either way, the point is that on most shared hosting, URL […] Read more »
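For comparison, the force-www flavor works the same way in reverse. Here is a generic sketch against the same placeholder domain.tld (not necessarily the exact directives from the full article):

	# enforce www prefix
	<IfModule mod_rewrite.c>
		RewriteEngine On
		RewriteCond %{HTTP_HOST} !^www\.domain\.tld$ [NC]
		RewriteRule ^(.*)$ http://www.domain.tld/$1 [R=301,L]
	</IfModule>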

HTTP Headers for ZIP File Downloads

You know when you’re working on a project and get stuck on something, so you scour the Web for solutions, only to find that everyone else seems to be experiencing the exact same thing? Then, after many hours trying everything possible, you finally stumble onto something that seems to work. This time, the project was setting up a secure downloads area for Digging into WordPress. And when I finally discovered a solution, I told myself that it was definitely something I had to share here at Perishable Press. Apparently, there is much to be desired when it comes to […] Read more »
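The crux of the problem is sending headers that tell the browser to download the ZIP intact rather than mangle it. One way to sketch that at the htaccess level, assuming mod_headers is available (this illustrates the general idea, not necessarily the article’s exact solution):

	# force ZIP files to download intact (requires mod_headers)
	<IfModule mod_headers.c>
		<FilesMatch "\.zip$">
			ForceType application/octet-stream
			Header set Content-Disposition attachment
		</FilesMatch>
	</IfModule>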

Latest Blacklist Entries

Recently I cleared several megabytes of log files, detecting patterns, recording anomalies, and blacklisting gross offenders. Gonna break it down into three sections: User Agents, Character Strings, and IP Addresses.

User Agents: user-agents come and go, and are easily spoofed, but it’s worth a few lines of htaccess to block the more persistent bots that repeatedly scan your site with malicious requests.

	# Nov 2010 User Agents
	SetEnvIfNoCase User-Agent "MaMa " keep_out
	SetEnvIfNoCase User-Agent "choppy" keep_out
	SetEnvIfNoCase User-Agent "heritrix" keep_out
	SetEnvIfNoCase User-Agent "Purebot" keep_out
	SetEnvIfNoCase User-Agent "PostRank" keep_out
	SetEnvIfNoCase User-Agent "archive.org_bot" keep_out
	SetEnvIfNoCase User-Agent "msnbot.htm)._" keep_out
	<Limit GET POST PUT>
	Order Allow,Deny […] Read more »
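The excerpt cuts off inside the Limit container, but the standard closing for this SetEnvIf pattern looks like the following (a generic sketch of the technique, not necessarily the article’s exact block):

	<Limit GET POST PUT>
		Order Allow,Deny
		# allow everyone except agents flagged with the keep_out variable
		Allow from all
		Deny from env=keep_out
	</Limit>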

Target iPhone and iPad with CSS3 Media Queries

When designing a website, it’s always a good idea to test on as many different platforms, devices, and browsers as possible. These days, pimping your websites for the iPhone and iPad is an important step in the design process. Especially on the iPad, sites tend to look about 20% cooler than on desktop browsers, so you definitely want to take the time to fine-tune the details. And when dealing with iDevices, it’s often necessary to deliver some custom CSS to make everything just right. Want to apply CSS styles to the iPad and iPhone? Here is the plug-n-play, copy-&-paste code […] Read more »
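The full copy-&-paste code is in the article, but iDevice media queries of this era generally follow the device-width pattern below (an illustrative sketch using the commonly published iPad and iPhone device widths, not necessarily the article’s exact rules):

	/* iPad (portrait and landscape) */
	@media only screen and (min-device-width: 768px) and (max-device-width: 1024px) {
		/* iPad-specific styles here */
	}

	/* iPhone (portrait and landscape) */
	@media only screen and (max-device-width: 480px) {
		/* iPhone-specific styles here */
	}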

How to Deal with Content Scrapers

Chris Coyier of CSS-Tricks recently declared that people should do “nothing” in response to other sites scraping their content. I totally get what Chris is saying here. He is basically saying that the original source of content is better than scrapers because:

- it’s on a domain with more trust.
- you published that article first.
- it’s coded better for SEO than theirs.
- it’s better designed than theirs.
- it isn’t at risk for serious penalization from search engines.

If these things are all true, then I agree: you have nothing to worry about. Unfortunately, that’s a tall order for many sites on […] Read more »

Country, Regional, and State Abbreviations

Creating dropdown menus for web forms is such a fun way to spend the afternoon. One of the funnest things for me is adding all of the regional, state, and country codes when they’re required. Here are a few lists to make my web-dev life a little easier. Here’s a quick jump menu:

- Country and Regional Abbreviations
- US State Abbreviations
- US States

Download as plain text file. Read more »
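For context, these lists feed straight into ordinary select elements. A minimal sketch with a few of the US state codes (the name and id attributes are placeholders):

	<select name="state" id="state">
		<option value="">Select a state</option>
		<option value="AL">Alabama</option>
		<option value="AK">Alaska</option>
		<option value="AZ">Arizona</option>
		<!-- ...and so on for the remaining states -->
	</select>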

2010 User-Agent Blacklist

Update: Check out the new and improved 2013 User Agent Blacklist! The 2010 User-Agent Blacklist blocks hundreds of bad bots while ensuring open access for the major search engines: Google, Bing, Ask, Yahoo, et al. Blocking bad user-agents is an effective addition to any security strategy. It works like this: your site is getting hammered by rogue bots that waste valuable server resources and bandwidth. So you grab a copy of the 2010 UA Blacklist from Perishable Press, include it in your site’s root .htaccess file, and enjoy a more secure and better-performing website. It’s that easy. Proven Security: The […] Read more »
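The blacklist itself is hundreds of entries long, but the mechanism can be sketched in a few lines. One common pattern uses mod_rewrite to match on the user-agent string (the agent names below are made up for illustration, and the actual list may use a different set of directives):

	# block bad user agents (illustrative names only)
	<IfModule mod_rewrite.c>
		RewriteEngine On
		RewriteCond %{HTTP_USER_AGENT} (EvilBot|NastyScraper) [NC]
		RewriteRule .* - [F,L]
	</IfModule>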

Best Method for Email Obfuscation?

A while ago, Silvan Mühlemann conducted a 1.5-year experiment in which different approaches to email obfuscation were tested for effectiveness. Nine different methods were implemented, with each test account receiving anywhere from zero to 1,800 spam emails. Here is an excerpt from the article: “When displaying an e-mail address on a website you obviously want to obfuscate it to avoid it getting harvested by spammers. But which obfuscation method is the best one? I drove a test to find out.” After reading through the article and its many findings, here are what seem to be the best methods for obfuscating email […] Read more »
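One family of methods from the test renders the address reversed in the markup and flips it back with CSS, so harvesters reading the raw HTML see gibberish while visitors see the real address. A minimal sketch (the class name and address are placeholders, and this is one tested approach, not necessarily the overall winner):

	<style>
		.codedirection {
			unicode-bidi: bidi-override;
			direction: rtl; /* render the reversed text back-to-front */
		}
	</style>
	<!-- displays as contact@example.com -->
	<span class="codedirection">moc.elpmaxe@tcatnoc</span>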

Protect Your Site with a Blackhole for Bad Bots

One of my favorite security measures here at Perishable Press is the site’s virtual Blackhole trap for bad bots. The concept is simple: include a hidden link to a robots.txt-forbidden directory somewhere on your pages. Bots that ignore or disobey your robots rules will crawl the link and fall into the trap, which then performs a WHOIS lookup and records the event in the blackhole data file. Once added to the data file, bad bots are immediately denied access to your site. I call it the “one-strike” rule: bots have one chance to follow the robots.txt protocol, check the […] Read more »
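The trap’s public-facing side boils down to two pieces: a robots.txt rule forbidding the trap directory, and a hidden link pointing into it. A minimal sketch of the concept (the directory name is a placeholder):

	# robots.txt
	User-agent: *
	Disallow: /blackhole/

And somewhere in the page markup:

	<!-- hidden from humans; only crawlers that ignore robots.txt will follow it -->
	<a href="/blackhole/" rel="nofollow" style="display:none;">Do not follow this link</a>

Compliant bots read the Disallow rule and stay out; everything else falls in and gets logged and blocked.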

htaccess Code for WordPress Multisite

For the upcoming Digging into WordPress update for WordPress 3.0, I have been working with WordPress’ multisite functionality. Prior to version 3.0, WordPress came in two flavors: “original” and “multisite” (MU). Most designers probably work with regular, single-blog installations of WordPress. The htaccess rules for all single-blog installations of WordPress haven’t changed; they are the same for WordPress 3.0 as they are for all previous versions. But now that multisite has merged with regular-flavored WordPress, we can stick with single-blog installs (which is how things are set up by default), or we can activate multisite functionality and create an unlimited […] Read more »
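For reference, here are those unchanged single-blog rules, as generated by WordPress for a root-directory install:

	# BEGIN WordPress
	<IfModule mod_rewrite.c>
	RewriteEngine On
	RewriteBase /
	RewriteRule ^index\.php$ - [L]
	# route requests for non-existent files/directories through WordPress
	RewriteCond %{REQUEST_FILENAME} !-f
	RewriteCond %{REQUEST_FILENAME} !-d
	RewriteRule . /index.php [L]
	</IfModule>
	# END WordPress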

2010 IP Blacklist

Update: Check out the new and improved 2013 IP Blacklist! Over the course of each year, I blacklist a considerable number of individual IP addresses. Every day, Perishable Press is hit with countless spammers, scrapers, crackers, and all sorts of other hapless turds. Weekly examinations of my site’s error logs enable me to filter through the chaff and cherry-pick only the most heinous, nefarious attackers for blacklisting. Minor offenses are generally dismissed, but the evil bastards that insist on wasting resources running redundant automated scripts are immediately investigated via IP lookup and denied access via a simple htaccess directive: […] Read more »
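The directive itself is in the full article, but the classic htaccess form for blocking a single address looks like this (the IP shown is a placeholder):

	# block an individual IP address
	<Limit GET POST PUT>
		Order Allow,Deny
		Allow from all
		Deny from 123.45.67.89
	</Limit>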

How to Micro-Optimize Your CSS

There are many ways to optimize your web pages. In addition to reducing HTTP requests and delivering compressed files, we can also minify code content. The easiest way to minify your CSS is to run it through an online code minifier, which automatically eliminates extraneous characters to reduce file size. Minification shrinks file size significantly, by as much as 30% or more (depending on input code). This size-reduction is the net result of numerous micro-optimization techniques applied to your stylesheet. By learning these techniques and integrating them into your coding practice, you’ll create better-optimized CSS during the development process. Sharper […] Read more »
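A quick before-and-after illustrates the kinds of micro-optimizations involved: combining longhand properties into shorthand, shortening hex colors, and dropping units from zero values (the selector is made up for illustration):

	/* before: six declarations, long values */
	div.box {
		margin-top: 10px;
		margin-right: 20px;
		margin-bottom: 10px;
		margin-left: 20px;
		background-color: #ffffff;
		border-width: 0px;
	}

	/* after: same result, fewer bytes */
	div.box {
		margin: 10px 20px;
		background: #fff;
		border-width: 0;
	}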

HTML5 Table Template

A good designer knows that tables should not be used for layout, but rather for displaying columns and rows of data. HTML enables the creation of well-structured, well-formatted tables, but they’re used infrequently enough that remembering all of the different elements and attributes is time-consuming and tedious. So to make things easier, here is a clean HTML5 template to speed up development on your next project.

Update, June 9th: Here is an absolute basic vanilla template for any markup language.
Update, June 9th: Here is a better template for HTML5.

For HTML 4.01 or anything XHTML-flavored, use the following template: […] Read more »
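The template itself is in the article, but a clean HTML5 table skeleton generally looks like this (a generic sketch, not necessarily the article’s exact markup):

	<table>
		<caption>Table caption</caption>
		<thead>
			<tr>
				<th scope="col">Header 1</th>
				<th scope="col">Header 2</th>
			</tr>
		</thead>
		<tbody>
			<tr>
				<td>Data 1</td>
				<td>Data 2</td>
			</tr>
		</tbody>
		<tfoot>
			<tr>
				<td>Footer 1</td>
				<td>Footer 2</td>
			</tr>
		</tfoot>
	</table>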

Wrapping Long URLs and Text Content with CSS

To wrap long URLs, strings of text, and other content, just apply this carefully crafted chunk of CSS code to any block-level element (e.g., perfect for <pre></pre> tags):

	pre {
		white-space: pre;           /* CSS 2.0 */
		white-space: pre-wrap;      /* CSS 2.1 */
		white-space: pre-line;      /* CSS 3.0 */
		white-space: -pre-wrap;     /* Opera 4-6 */
		white-space: -o-pre-wrap;   /* Opera 7 */
		white-space: -moz-pre-wrap; /* Mozilla */
		white-space: -hp-pre-wrap;  /* HP Printers */
		word-wrap: break-word;      /* IE 5+ */
	}

See demonstration. Explanation: by default, the white-space property is set to normal, so you might see something like this when trying to […] Read more »

htaccess Redirect to Maintenance Page

Redirecting visitors to a maintenance page or other temporary page is an essential tool to have in your tool belt. Using htaccess, redirecting visitors to a temporary maintenance page is simple and effective. All you need to redirect your visitors is the following code placed in your site’s root htaccess file:

	# MAINTENANCE-PAGE REDIRECT
	<IfModule mod_rewrite.c>
		RewriteEngine on
		RewriteCond %{REMOTE_ADDR} !^123\.456\.789\.000
		RewriteCond %{REQUEST_URI} !/maintenance.html$ [NC]
		RewriteCond %{REQUEST_URI} !\.(jpe?g?|png|gif) [NC]
		RewriteRule .* /maintenance.html [R=302,L]
	</IfModule>

That is the official copy-&-paste goodness right there. Just grab, gulp and go. Of course, there are a few more details for those who may be unfamiliar […] Read more »
