
Optimizing Google Analytics Performance

It has occurred to me lately that I no longer use Google Analytics for Perishable Press. Instead, I find myself keeping an eye on things using Mint almost exclusively. So the question now is: do I continue serving the GA JavaScript to keep the profile active, just in case I ever need the additional stats? I mean, Mint already does a great job of recording all of the information I could ever need, so I no longer see the use for Google Analytics. I do wonder, however, if Google ranks GA-enabled sites a bit higher than non-GA sites. Hmmm.. it seems to me that there are several options going forward..

Option 1: Continue with Google Analytics

The easiest thing for me to do at this point would be to just leave it alone: continue serving the extra 6.3K/21.4K GA JavaScript (i.e., urchin.js) to site visitors. Sure, I may never actually use the volumes of data I am collecting via Analytics, but hey, who cares, right? Of course, delivering the urchin.js file requires bandwidth and other server resources, and also tends to slow things down a bit, especially on those rare occasions when the Google server bogs down.

Option 2: Stop using Google Analytics

If I decide to discontinue Google Analytics altogether, I would simply need to edit my footer.php file and remove the GA code. If I do this, my site will require fewer resources, consume less bandwidth, and enjoy faster loading times. Increased site performance is a huge factor for me: anything I can do to optimize performance and enhance the user experience is at the top of my list. Plus, not everyone enjoys being tracked statistically, nor does everyone surf the Web with JavaScript enabled/available. At this point, dropping GA entirely is looking better and better..

Option 3: Serve the urchin.js file locally

One of the main reasons why killing Analytics seems so attractive is the (slightly) faster page-loading times that would be enjoyed without it. Thus, a third option for dealing with my currently unused GA setup involves optimizing performance by serving the requisite urchin.js file locally. The simplest way to do this is to download a copy of the file to your server and call it directly by changing the standard code generally located in your footer:

<!-- notice the address change in the next line to serve urchin.js locally -->
<script src="https://perishablepress.com/stats/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
	_uacct = "UA-332938-1";
	urchinTracker();
</script>

This would be ideal, if it weren’t for the fact that Google periodically changes the urchin.js file. They do this infrequently enough (approximately once per year) that manually updating the file would be trivial work; the hard part is actually remembering to do it.
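If you do go the manual route, a quick byte-for-byte comparison tells you whether Google’s copy actually changed since your last download. The sketch below is only an illustration: the two files are stand-ins for your existing local copy and a freshly downloaded urchin.js (the filenames are hypothetical, not from the article):

```shell
# Stand-in files for illustration; in practice these would be your
# current local copy and a freshly downloaded urchin.js
printf 'urchin v1\n' > local-urchin.js
printf 'urchin v2\n' > fresh-urchin.js

# cmp -s exits 0 only when the files are byte-identical
if cmp -s fresh-urchin.js local-urchin.js; then
	echo "no update needed"
else
	echo "urchin.js changed upstream; replacing local copy"
	cp fresh-urchin.js local-urchin.js
fi
```

In practice you would download the fresh copy first (e.g., via wget or curl) and run the comparison against that before overwriting anything.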

Option 4: Run a script to cache urchin.js locally

To host the file locally without relying on my frail human memory to update it, I could implement a PHP script to cache a local copy and periodically check for updates. Seriously considering this option, I began researching code snippets that would help me fashion such a script. As it turns out, however, an engineering student from India named Joyce Babu wrote a script for exactly this purpose several months ago:

<?php // source: http://www.joycebabu.com/blog/speed-up-google-analytics-using-simple-php-script.html

// Remote file to download
$remoteFile = 'http://www.google-analytics.com/urchin.js';
// Local file name. Must be made writable
$localFile = "local-urchin.js";
// Time to cache in hours
$cacheTime = 24;
// Connection timeout in seconds
$connTimeout = 10;
// Use Gzip compression
$useGzip = true;

if ($useGzip && extension_loaded('zlib')) {
	ob_start('ob_gzhandler');
}
header('Content-Type: text/javascript');

// Serve the cached copy if it exists and is still fresh
if (file_exists($localFile) && (time() - ($cacheTime * 3600) < filemtime($localFile))) {
	readfile($localFile);
	exit;
}

$url  = parse_url($remoteFile);
$host = $url['host'];
$path = isset($url['path']) ? $url['path'] : '/';
if (isset($url['query'])) {
	$path .= '?' . $url['query'];
}
$port = isset($url['port']) ? $url['port'] : 80;

$fp = @fsockopen($host, $port, $errno, $errstr, $connTimeout);
// On connection failure, return the cached file (if it exists)
if (!$fp) {
	if (file_exists($localFile)) {
		readfile($localFile);
	}
	exit;
}

// Send the header information
$header  = "GET $path HTTP/1.0\r\n";
$header .= "Host: $host\r\n";
$header .= "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6\r\n";
$header .= "Accept: */*\r\n";
$header .= "Accept-Language: en-us,en;q=0.5\r\n";
$header .= "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n";
$header .= "Referer: http://$host\r\n";
$header .= "Connection: close\r\n\r\n";
fputs($fp, $header);

// Get the response from the remote server
$response = '';
while ($line = fread($fp, 4096)) {
	$response .= $line;
}
// Close the connection
fclose($fp);

// Remove the response headers
$pos = strpos($response, "\r\n\r\n");
$response = substr($response, $pos + 4);

// Return the processed response
echo $response;

// Save the response to the local file (fopen 'w' creates it if needed)
if ($fp = @fopen($localFile, 'w')) {
	fwrite($fp, $response);
	fclose($fp);
}
?>
Simply upload his script and edit your analytics code to call the script instead of calling the urchin.js file itself. The script then downloads a fresh copy of urchin.js at the specified time interval. The upside is that the script eliminates the need to remember to manually replace the file; the downside is that the script runs on every page load. If it weren’t for that, this would be the ideal solution. Joyce even created an enhanced WordPress plugin to simplify the entire process.
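The heart of the script is a simple freshness test against the cached file’s modification time. Here is the same logic sketched in shell (the 24-hour window matches the script’s $cacheTime default; the filename is just a stand-in):

```shell
cache_hours=24
file="local-urchin.js"
touch "$file"  # stand-in for the cached copy; here it is brand new

now=$(date +%s)
# File modification time (GNU stat, with a BSD stat fallback)
mtime=$(stat -c %Y "$file" 2>/dev/null || stat -f %m "$file")

if [ $(( now - mtime )) -lt $(( cache_hours * 3600 )) ]; then
	status="fresh"  # serve the local copy as-is
else
	status="stale"  # time to re-fetch from Google
fi
echo "cache is $status"
```

Because the stand-in file was just touched, this prints "cache is fresh"; only when the file’s age exceeds the window would a fresh download be triggered.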

Option 5: Invoke the powers of cron

As mentioned, the only downside to running a caching script to keep a locally served copy of urchin.js fresh and delicious is the perpetual processing of the script itself. Fortunately, we may call upon the mighty powers of cron to schedule automatic script execution at specified intervals. Using a cron job to execute the update script ensures file freshness without having to constantly run the script; once a week (or however often you choose) does the job. In memory-limited environments or on setups with other resource restrictions, conserving processing power via cron is ideal. Simply upload the script and create a cron job as follows:

/usr/local/bin/php /home/username/path/to/script.php >/dev/null

..or, alternately (in cPanel):

/usr/local/bin/php -q /home/username/path/to/script.php
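For a raw crontab (rather than cPanel’s interface), the entry also needs the schedule fields. The following line, with a hypothetical path, runs the update script every Sunday at 4:00 am and discards all output:

```
# minute hour day-of-month month day-of-week command
0 4 * * 0 /usr/local/bin/php -q /home/username/path/to/script.php >/dev/null 2>&1
```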

After setting up this command with the desired processing interval, the update script will quietly keep your local copy of urchin.js current. Finally, if using this method, remember to call your local copy of urchin.js (instead of the update script itself), as per the code presented in “Option 3”, above.

Option 6: Upgrade to the new Google Analytics

I can’t believe I missed this during my initial write-up. As astute reader Louis points out, Google has completely upgraded its Analytics program, including a complete rewrite of the original JavaScript used in the omnipresent urchin.js script. According to reports, the new code — ga.js — is leaner, faster and smarter, capable of event tracking, cross-browser caching, and object-oriented functionality. In short, it rocks. Although I have not yet had the opportunity to fully investigate this new scriptness, it may very well be the solution for which I am looking — locally hosting and automatically upgrading the new ga.js file may not even be necessary given the optimized nature of the script. Enough for now, here is the complete code required to start using the new Google Analytics this very moment:

<script type="text/javascript">
     var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
     document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
     var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
     pageTracker._initData();
     pageTracker._trackPageview();
</script>

A little messy, but hey, who cares, right? Note that it is critical to use either the old GA code or the new GA code, but not both. Simply replace your previous Google Analytics code with this new version and you are good to go! Don’t forget to edit the “UA-XXXXXXX-X” to reflect your actual GA account number.

Da wrapz..

As far as I am able to tell, these are the most reasonable solutions for optimizing and/or dealing with your (possibly unused) Google Analytics configuration. The important thing here is to be mindful of your involvement with GA — are you actually benefiting from Analytics, or has it fallen into disuse, like that old exercise bike gathering dust in the basement? If you use it, optimize it. If not, why waste resources at all? Although I remain undecided at this point, chances are high that Google Analytics will disappear completely during the next site redesign.

What about you? Are you using Google Analytics, and if so, how often do you examine the results? Do you use other statistical applications as well? What is your strategy for statistics?

About the Author
Jeff Starr = Fullstack Developer. Book Author. Teacher. Human Being.

7 responses to “Optimizing Google Analytics Performance”

  1. In this post you talk about urchin.js, so I wonder whether you are aware of the existence of the new ga.js script?

    On performance, my tests showed that Google was able to serve the script faster than my own server. Plus, the URL http://www.google-analytics.com/ga.js is the same when a visitor surfs two GA-enabled sites in a row, so the ga.js script can be cached between sites: a user arriving at your site after visiting another site that uses GA already has the file cached.

  2. Perishable 2008/02/05 1:02 am

    Wow! Where have I been hiding? I had no idea that such an improvement had been made to the Google Analytics script. After reading your comment, I spent some time investigating the new GA, and must say that I am impressed. The script is minified and the cross-browser caching is a huge improvement. Although I have not yet used this method, I am going to update this article based on my current findings. The new GA script seems vastly better than urchin, however I find the inclusion code remarkably hideous, to say the least (of course, I am nitpicking here).

    Thanks for the heads up, Louis!

  3. I thought the same about the inclusion code, so I went with a classic script inclusion, nothing fancy.

    Also, I’m very happy to teach you a little something, after all you’ve brought me.

  4. A tiny extra optimization you can do, if you are completely sure your site will NOT use secure (SSL) pages, is to call the JS directly as:

    <script src="http://www.google-analytics.com/ga.js" type="text/javascript"></script>

    instead of letting the script detect if you are in HTTPS or not.

    Best regards

  5. Perishable 2008/06/29 2:11 pm

    Excellent tip, Pablo — thanks for sharing! I will be using this trick in the future, as it looks like it may help improve performance a bit. Btw, I consolidated your posts with the correct code.

  6. free games 2008/10/09 5:07 am

    Found this (untested) for those that just want HTTPS:

    <script src="https://ssl.google-analytics.com/ga.js" type="text/javascript"></script>
    <script type="text/javascript">
         var pageTracker = _gat._getTracker("UA-XXXX-1");
         pageTracker._trackPageview();
    </script>

    My YSlow score has improved only marginally, but the pages are quicker.

  7. Very nice! Thanks for the tip! :)

Comments are closed for this post. Something to add? Let me know.
Perishable Press is operated by Jeff Starr, a professional web developer and book author with two decades of experience. Here you will find posts about web development, WordPress, security, and more »