
SEO Experiment: Let Google Sort it Out

One way to prevent Google from crawling certain pages is to use <meta> elements in the <head> section of your web documents. For example, if I want to prevent Google from indexing and archiving a certain page, I would add the following code to the head of my document:

<meta name="googlebot" content="noindex,noarchive" />
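For context, here is how that tag might sit in a minimal document — a sketch only, with placeholder title and content:

```html
<!DOCTYPE html>
<html>
<head>
	<title>Example Page</title>
	<!-- Tell Googlebot not to index or cache this page -->
	<meta name="googlebot" content="noindex,noarchive" />
</head>
<body>
	<!-- page content -->
</body>
</html>
```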

I’m no SEO guru, but it is my general understanding that it is possible to manipulate the flow of PageRank through a site via strategic placement of <meta> directives.

After thinking about it, I recently decided to remove the strategic <meta> directives from my pages here at Perishable Press. This strategy was initially designed to allow indexing and archiving of only the following pages:

  • The Home Page
  • Single Posts
  • Tag Archives

Everything else included <meta> directives that prevented compliant search engines from indexing and archiving the page. This common SEO formula was implemented over a year ago and the initial results seemed to indicate:

  • Reduced number of pages indexed by Google
  • Reduced number of pages in the supplemental index
  • Slight reduction in overall number of unique visitors
  • Improved average visitor duration (lower bounce rate)

These results have been obscured over time, primarily due to the continued growth of the site. For example, the number of indexed pages has increased, but so has the total number of pages available for indexing. Likewise, the number of unique visitors has increased as well, but apparently due to the greater amount of indexed content. Further, I don’t think the supplemental index even exists these days (does it?).

So, in a bit of an experiment that began earlier this year, I removed the following conditional <meta> code from my theme’s header.php files:

<?php // Index/archive only the home page (first page), tag archives, and single posts
if ( ( is_home() && ( !$paged || $paged == 1 ) ) || is_tag() || is_single() ) { ?>
		<meta name="googlebot" content="index,archive,follow,noodp" />
		<meta name="robots" content="all,index,follow" />
		<meta name="msnbot" content="all,index,follow" />
<?php } else { // everything else: keep out of the index, but let bots follow links ?>
		<meta name="googlebot" content="noindex,noarchive,follow,noodp" />
		<meta name="robots" content="noindex,follow" />
		<meta name="msnbot" content="noindex,follow" />
<?php } ?>
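The same logic could also be hooked in from functions.php (or a plugin) via the `wp_head` action, rather than hard-coding it into the theme template. A minimal sketch — the function name is my own invention, and it assumes the standard WordPress conditional tags are available:

```php
<?php
// Hypothetical helper: echoes the same conditional robots meta
// as the header.php snippet above, attached via the wp_head hook.
function pp_robots_meta() {
	global $paged;
	if ( ( is_home() && ( !$paged || $paged == 1 ) ) || is_tag() || is_single() ) {
		echo '<meta name="robots" content="all,index,follow" />' . "\n";
	} else {
		echo '<meta name="robots" content="noindex,follow" />' . "\n";
	}
}
add_action( 'wp_head', 'pp_robots_meta' );
```

One advantage of the hook approach is that the directives survive a theme switch, since they no longer live in header.php.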

Within a week or so of removing these “page-sculpting” directives, a predictable pattern began to emerge:

  • Increased number of pages indexed by Google
  • Slight increase in overall number of unique visitors
  • Slight decrease in average visitor duration

These results were expected and seem to make sense:

  • Removing the noindex directives allows Google to include more pages in the index
  • More pages in the index correlates with more unique visitors
  • More indexed pages mean more matches for marginally relevant queries, but those visitors leave quickly and thereby increase the bounce rate

At this point, I’m not too concerned one way or another about SEO and trying to get a gazillion hits and tons of page rank. Considering that I operate the site for my own benefit and don’t collect any advertising revenue, I would say that I have more than enough traffic to keep me busy.

The point of this exercise was to experiment with a commonly employed SEO strategy. Given the initial results, it looks like I don’t need the conditional meta directives after all. So for now, rather than fighting it, I think I will just kick back, focus on creating quality content, and let Google take care of itself.

Do you use any similar rank-sculpting techniques? Do they work? It would be great to hear if anyone has experienced a significant statistical impact using similar methods.

Jeff Starr
About the Author
Jeff Starr = Creative thinker. Passionate about free and open Web.
