
SEO Experiment: Let Google Sort it Out

One way to prevent Google from indexing certain pages is to use <meta> elements in the <head> section of your web documents. For example, if I want to prevent Google from indexing and archiving a certain page, I would add the following code to the head of my document:

<meta name="googlebot" content="noindex,noarchive" />

I’m no SEO guru, but it is my general understanding that it is possible to manipulate the flow of page rank throughout a site via strategic implementation of <meta> directives.

After thinking about it, I recently decided to remove the strategic <meta> directives from my pages here at Perishable Press. This strategy was initially designed to allow indexing and archiving of only the following pages:

  • The Home Page
  • Single Posts
  • Tag Archives

Everything else included <meta> directives that prevented compliant search engines from indexing and archiving the page. This common SEO formula was implemented over a year ago and the initial results seemed to indicate:

  • Reduced number of pages indexed by Google
  • Reduced number of pages in the supplemental index
  • Slight reduction in overall number of unique visitors
  • Improved average visitor duration (lower bounce rate)

These results have been obscured over time, primarily due to the continued growth of the site. For example, the number of indexed pages has increased, but so has the total number of pages available for indexing. Likewise, the number of unique visitors has increased, apparently due to the greater amount of indexed content. Further, I don’t think the supplemental index even exists these days (does it?).

So, in a bit of an experiment that began earlier this year, I removed the following conditional <meta> code from my theme’s header.php files:

<?php // Indexable pages: home (first page only), tag archives, and single posts.
if ( ( is_home() && ( ! $paged || $paged == 1 ) ) || is_tag() || is_single() ) { ?>
		<meta name="googlebot" content="index,archive,follow,noodp" />
		<meta name="robots" content="all,index,follow" />
		<meta name="msnbot" content="all,index,follow" />
<?php } else { // Everything else: followed but not indexed or archived. ?>
		<meta name="googlebot" content="noindex,noarchive,follow,noodp" />
		<meta name="robots" content="noindex,follow" />
		<meta name="msnbot" content="noindex,follow" />
<?php } ?>
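
As an aside, on current versions of WordPress (5.7 and later), the same conditional logic could be handled with the wp_robots filter instead of printing meta tags from header.php. The following is only a rough sketch along those lines (the function name is made up), not code that has ever run on this site:

<?php
// Sketch: replicate the old conditional via the wp_robots filter
// (WordPress 5.7+). The function name is illustrative only.
function pp_sculpt_robots( $robots ) {
	if ( ( is_home() && ! is_paged() ) || is_tag() || is_single() ) {
		// Indexable: home page (first page only), tag archives, single posts.
		$robots['index']  = true;
		$robots['follow'] = true;
	} else {
		// Everything else: followed but not indexed or archived.
		$robots['noindex']   = true;
		$robots['noarchive'] = true;
		$robots['follow']    = true;
	}
	return $robots;
}
add_filter( 'wp_robots', 'pp_sculpt_robots' );
?>

A nice side effect of the filter approach is that core consolidates all directives into a single robots meta tag.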

Within a week or so after removing these “page-sculpting” directives, a predictable pattern began to emerge:

  • Increased number of pages indexed by Google
  • Slight increase in overall number of unique visitors
  • Slight decrease in average visitor duration

These results were expected and seem to make sense:

  • Removing the noindex directives allows Google to include more pages in the index
  • More pages in the index correlates with more unique visitors
  • More indexed page combinations match more irrelevant queries, but those visitors leave quickly, thereby increasing the bounce rate

At this point, I’m not too concerned one way or another about SEO and trying to get a gazillion hits and tons of page rank. Considering that I operate the site for my own benefit and don’t collect any advertising revenue, I would say that I have more than enough traffic to keep me busy.

The point of this exercise was to experiment with a commonly employed SEO strategy. Given the initial results, it looks like I don’t need the conditional meta directives after all. So for now, rather than fighting it, I think I will just kick back, focus on creating quality content, and let Google take care of itself.

Do you use any similar rank-sculpting techniques? Do they work? It would be great to hear if anyone has experienced a significant statistical impact using similar methods.


13 responses to “SEO Experiment: Let Google Sort it Out”

  1. Donace | TheNexus 2009/05/10 9:46 am

    For The Nexus, I use nofollow and noindex so as to keep only the front page and single posts/pages ‘indexed’.

    Since doing this I have seen a rise in visitor duration: 83% of all visitors now stay for 20+ minutes, though 79.5% of all visitors look at only one page.

    The same principle applies to my use of nofollow and ‘PR sculpting’.

    So, on the assumption that PR = better SEO (as it is a search-engine metric of quality):

    I could argue that ‘sculpting’ with noindex/follow/archive does work: your site holds 11k links and is PR4 (homepage), while mine holds 10.5k and is PR5, so it seems to work.

    HOWEVER, I have 251 pages cached (101 pages and the rest remnants of old tags, etc.), while you have 1540 pages cached (though this includes tag / date / category archives).

    So, looking at that, it is arguable that, due to the extra pages, the ‘link juice’ is diluted more across your site.

    So if we then look at the % of pages with PR:

    A quick in-depth PR analysis shows that, of the 101 pages on TN, 40/101 (39.6%) have PR 2-5 (1 at PR5), while here 357/1540 (23.2%) have PR 1-5 (12 posts at PR5).

    So, following on from the (possibly flawed) premise that PR = better SEO, it LOOKS like sculpting works, as PR is more prominent on TN than on PP.

    Though, as with all things SEO, we have to look at other factors, such as the age of the site and links to the site.

    An (earlier) analysis showed PP is way older ( :p ) and its link ‘quality’ is ‘better’ (based on http://thenexus.tk/how-to-get-the-best-links/, i.e. it has more in-post links), as opposed to my links, which are mainly social-media links and/or from comments.

    So in theory, due to the higher quality of links to PP, it SHOULD be getting better PR / rankings (two separate things).

    So, in conclusion: hell yea, sculpting works, as PP, the site that SHOULD (in theory) be superior on that percentage, is not.

    The problem with the above analysis, though, is that I am weighting PR 1/2/3/4/5 all the same rather than separately; if the number of higher-PR vs lower-PR pages were accounted for properly, it might result in a higher %… I may do that later, or I may just email you the in-page PR analysis so you can work it out yourself ;)

    Saying all this, the only way to truly test this hypothesis, of course, is to let me at PP’s theme code / admin panel and compare the results after the next PR update!

    (yes, I do love unfounded, baseless assumptions based on what looks like logical maths! …kinda like advanced physics :p)

  2. Donace | TheNexus 2009/05/11 10:12 am

    hmm, speculation is fun; at the end of the day, though, if it works… don’t break it :p (unless you’re bored and want to rebuild :p)

  3. Hey Donace, thanks for the great feedback! I actually like the idea of only indexing single posts and the home page. It seems like this would force link equity to be distributed among the pages that people will actually spend time on. This strategy seems like it would decrease the overall flow of traffic from Google et al, but I often wonder how effective impressions are compared to more targeted hits.

    Another thing that I would like to do is somehow prevent indexing of older posts — say, anything older than xxx days. It seems that Google already does this to a certain degree, but consolidating all of the lower ranking pages’ juice into the newer pages could have a positive impact.

    Interesting stuff to think about, as long as you don’t get obsessed with it! ;)

  4. Joshua Richardson 2009/05/17 7:58 pm

    Unless you use nofollow on internal links, you are not actually PR-sculpting/siloing.

    I use noindex on the non-meaningful pages on my websites, e.g. disclaimer, privacy policy, etc.

    For the most part I nofollow external links, to keep any PR from leaving my site.

    Sometimes I will nofollow internal links too, e.g. a client login area that doesn’t need to be indexed in Google, where I want to keep some of my internal PR.

    For the most part mucking around with PR doesn’t really matter unless you have a bigger website and are trying to place emphasis on individual pages for ranking purposes in google.

    There are a lot of better ways to SEO your site first, though; playing around with PR is probably one of the last tricks in the book you’d use.

    If you want to place more emphasis on newer pages, one way to do this is via the priority field in your sitemap.xml file.
    I checked yours and noticed that every post has a priority of 1.0; you might want to have it so that the newest posts are 1.0, posts two weeks old are 0.90, and so on, as Google uses this metric somewhat in determining what priority each page should have.
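
    For illustration, a hypothetical sitemap fragment with decaying priorities (the URLs and values are made up):

    <url>
    	<loc>http://example.com/brand-new-post/</loc>
    	<priority>1.0</priority>
    </url>
    <url>
    	<loc>http://example.com/two-week-old-post/</loc>
    	<priority>0.9</priority>
    </url>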

    If you wanted old posts to be completely removed from Google, you could write a WordPress plugin that noindexes posts over a certain age, along the lines of the sketch below.
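
    A rough, untested sketch of what such a plugin might do (the function name and the 180-day cutoff are made up):

    <?php
    // Untested sketch: noindex single posts older than 180 days by
    // printing a robots meta tag from the wp_head hook.
    function noindex_old_posts() {
    	if ( is_single() ) {
    		$age_in_days = ( time() - get_post_time( 'U', true ) ) / 86400;
    		if ( $age_in_days > 180 ) {
    			echo '<meta name="robots" content="noindex,follow" />' . "\n";
    		}
    	}
    }
    add_action( 'wp_head', 'noindex_old_posts' );
    ?>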

    Just a few of my opinions, but there’s a lot you can really do (if you have the time to muck around, that is).

  5. Thanks for the input, Joshua. Modifying my XML sitemap priority values is an excellent idea and something that needs to be done. I think I set that up around two years ago and wasn’t really too concerned about it. I didn’t think that Google actually paid any attention to the specified values.

    I suppose if anything I would like to consolidate more of the PR on the home page while keeping my articles well-ranked. When I first started this site, I had a PR 6 on the home page, but only around 50 or so posts. As the number of posts continues to grow (nearly 700 now), my home-page PR just keeps decreasing, perhaps diluted by all of those posts. It’s now at 4, but the site is still getting tons of great traffic.

  6. “So for now, rather than fighting it, I think I will just kick back, focus on creating quality content, and let Google take care of itself.”

    Thank you!
    I tweeted the same thing the other day: the best thing you can do is write quality content.

    I even sent a proposal to a prospective client that said pretty much the same thing: focus on quality content, the social web will pick you up, and don’t worry about Google. Make it easy to be indexed, but don’t write copy for the search engines. I think that because I was telling her that while “SEO Masters” were saying “give us $3,000 and we’ll optimize your site,” she decided to go against my advice.

    One of my pages is ranked #1 for a fairly popular topic (“Photoshop tilt shift”). I never bothered to optimize it, I never will. I still don’t know why it’s #1. And last I checked, it had a PR of 1.

  7. Jeff Starr 2009/05/18 3:58 pm

    Absolutely, George. If I had followed the crowd and spent all of my time working the social-media scene and clamoring for links, I may have had slightly higher PR for some of my pages, but there would have been no time for actual content.

    It always kills me to find a site that is brimming with rank but with nothing but crap for content. Besides, when it’s all said and done, you will either have learned more about your topic, improved as a writer, and built a site with high-quality content, or you will have traded all of that for a few more clicks up the food chain. For me, it’s a no-brainer: I hated the popularity contest in school and I hate it just as much on the Web.

    In any case, thanks for the comment, and congrats on the #1 article. Proof that PR and traffic do not necessarily predict one another.

  8. I noticed on some themes that, when applying the given script, there would be a space after the opening title tag and before the post title, for example: “ Post Title – Blog Title”. I’m big on keeping concise code formatting, so something like this grinds my gears. Figure if someone else wants a solution to make it “Post Title – Blog Title”: in wp-includes/general-template.php, find $prefix = ''; if ( !empty($title) ) $prefix = " $sep "; and remove the spaces around $sep. There may be other areas where this occurs that should be fixed the same way.

    For a PHP solution that avoids messing with WP core files, I tried a few simple functions, but none were successful. A good question: is SEO even affected by having a space at the start of the title tag?
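
    For reference, one non-core approach that might work is trimming the title output via the wp_title filter; this is an untested guess, not a confirmed fix:

    <?php
    // Untested guess: strip leading whitespace from the generated title
    // via the wp_title filter, leaving core files untouched.
    function trim_wp_title( $title, $sep, $seplocation ) {
    	return ltrim( $title );
    }
    add_filter( 'wp_title', 'trim_wp_title', 10, 3 );
    ?>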

  9. hey Jeff,
    I found you on Google. I was looking for something a little different, but I thought I’d mention that I just read a Matt Cutts (he’s a Google engineer) blog post on “page rank sculpting”, and the bottom line was: don’t bother with it. As you said, it’s best just to let Google do what it does. :)
    I didn’t put a live link because I don’t want to get “moderated”, but it’s on his blog:
    www.mattcutts.com/blog/pagerank-sculpting/
    ~ steve booth

  10. Jeff Starr 2009/06/17 9:45 am

    Thanks for the info, steve – I did catch that article from Matt and managed to jump in on the conversation fairly early in the game (even got a reply from Mr. Cutts himself – woo hoo!). Cheers, Jeff

  11. For some reason, I like the idea of indexing and following.
    I don’t quite understand why you wouldn’t want Google crawling your pages. I think it’s mostly a security thing, but I don’t think that’s the point.
    I’m using the meta index and follow directives and they work well. My site receives a lot of traffic.

  12. study8home 2010/01/22 2:54 am

    Fabulous article!
    I like the information on experimenting with SEO for Google; this is important as far as SEO is concerned.
    Thanks for such wonderful information, and keep doing the good work.
    Looking for more from you.
    Thanks
