
Back in August, the team at Zoompf published a joint research study with Moz analyzing How Website Speed Actually Impacts Search Ranking. In this research, a surprising result showed no clear correlation between page load time and search ranking. This confounded us, since we expected to see at least some small measure of correlation, especially after Google announced in 2010 that site speed would have a partial impact on search ranking. We did, however, observe a correlation between "Time to First Byte" and search ranking, which we explored in more detail in our follow-up post.

In these two articles, our readers noted that while page load time may not appear to directly impact search ranking, it still has an obvious impact on user experience and will likely have an increasing impact on search ranking in the future. In other words, page load time should still be considered a priority for the success of your site.

But how big of a priority is it, really? Of course, it depends: The slower your site is now, the further your user experience lags behind your competitors'. Additionally, the more traffic your site receives, the more benefit you'll see from performance optimization (we'll dig into that more below).

The good news is that, unlike the impact on search ranking, there is a wide body of independent research showing clear causation between improved site performance and increased conversion rates, user engagement, and customer satisfaction. It also just makes sense—we've all visited slow websites, and we've all bailed out when the page takes too long to load. On mobile we're even less patient.

What may be surprising, though, is just how big an impact slow performance can have on your conversions. Let's look at that first.

The research


Back in 2006, Amazon presented one of the first studies demonstrating a clear causal link between page load time and online customer revenue, summarized in Greg Linden's presentation Make Data Useful. Through A/B testing, Greg showed that every 100-millisecond delay in page rendering time resulted in a 1% loss of sales for Amazon.

In more recent research, Intuit presented findings at Velocity 2013 from their effort to reduce page load time from 15 seconds to 2 seconds. During that effort, they observed a dramatic increase in conversions for every second shaved off their page load time, in a stair step that flattened as the site got faster. Specifically:

  • +3% conversions for every second reduced, from 15 seconds to 7 seconds
  • +2% conversions for every second reduced, from 7 seconds to 5 seconds
  • +1% conversions for every second reduced, from 4 seconds to 2 seconds

In other words, there was tremendous value in the initial optimization, and diminishing value as they got faster.

In another recent report, Kyle Rush from the 2011 Obama for America campaign showed through A/B testing that a 3-second reduction in page load time (from 5 seconds to 2 seconds) improved onsite donations by 14%, resulting in an increase of over $34 million in election contributions.

In fact, there's a wide body of research supporting clear economic benefits of improving your site performance, and clearly the slower your site is, the more you have to gain. Additionally, the higher your traffic, the larger the impact each millisecond will yield.

How fast should I be?

Whenever we talk with people about web performance, they always want to know "How fast should I be?" Unfortunately, this one is hard to answer, since the answer depends on your business goals. Those in the performance industry (of which, full disclosure, Zoompf is a member) may push you to hit two seconds or less, citing research such as that from Forrester showing that 47% of users expect pages to load in two seconds or less.

We prefer a more pragmatic approach: Optimize to the point where the ROI still makes sense. The higher your traffic, the more monetary difference each millisecond gained will make. If you're Amazon.com, a 200-millisecond improvement could mean millions of dollars. If you're just launching a new site, getting down to 4-6 seconds may be good enough. It's really a judgment call based on your current traffic levels, where your competition sits, your budget, and your strategic priorities.

The first step, though, is to measure where you stand. Fortunately, there's a great free tool supported by Google at WebPageTest.org that can measure your page load time from various locations around the world. If you receive a lot of international traffic, don't just select a location close to home—see how fast your site is loading from Sydney, London, Virginia, etc. The individual results may vary quite a bit! WebPageTest has a lot of bells and whistles, so check out this beginner's guide to learn more.

Where do I start?

Improving the performance of your site can seem daunting, so it's important you start with the low-hanging fruit. Steve Souders, the Head Performance Engineer at Google, has famously stated:

"80-90% of the end-user response time is spent on the front-end. Start there."

This has come to be called the Performance Golden Rule. In layman's terms, this means that while optimizing your web server and database infrastructure is important, you will get a higher return on your time investment by first optimizing the front-end components loaded by your users' browsers. This means all the images, CSS, JavaScript, Flash and other resources linked as dependencies from your base HTML page.

You can see the Performance Golden Rule well illustrated in a typical waterfall chart returned by tools like WebPageTest. Note how the originally requested page accounts for only a small fraction of the overall time. Generating this base page is where all the back-end server work is done; all the other resources included by that page (images, CSS, etc.) are what take the large majority of the time to load:

[Image: waterfall_frontend]

So how can you speed up your front-end performance and reap the rewards of a better user experience? There are literally hundreds of ways. In the sections below, we will focus on the high-level best practices that generally yield the most benefit for the least amount of effort.

Step 1: Reduce the size of your page

Bloated content takes a long time to download. By reducing the size of your page, you not only improve your speed, you also reduce the network bandwidth your hosting provider charges you for.

An easy optimization is enabling HTTP compression, which can often reduce the size of your text resources (HTML, CSS, and JavaScript) by 50% or more. WhatsMyIP.org has a great free tool to test whether compression is turned on for your site. When using it, don't just test the URL of your home page; also test links to your JavaScript and CSS files. Often we find compression is turned on for HTML files but not for JavaScript and CSS, so configuring your server to compress those properly can represent a considerable performance boost. Keep in mind, though, that you do NOT want your server to compress your images, as they are already compressed; the extra server processing time will only slow things down. You can learn more in this detailed guide on what content you should compress on your website.
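To get a feel for the savings, here's a minimal Python sketch (standard library only) that gzip-compresses a sample HTML string and reports the size reduction. The markup below is invented for illustration; real pages will compress by different amounts, but repetitive text content typically shrinks dramatically:

```python
import gzip

# A small, repetitive HTML snippet standing in for a real page.
html = ("<html><head><title>Demo</title></head><body>"
        + "<div class='row'><p>Hello, world!</p></div>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)

ratio = 100 * (1 - len(compressed) / len(html))
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0f}% smaller)")
```

This is the same transformation your web server applies on the fly when HTTP compression is enabled and the browser sends an `Accept-Encoding: gzip` header.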

If you find your server is not using compression, talk to your server admin or hosting provider to turn it on. It's often a simple configuration change; for example, see the mod_deflate module for Apache, the IIS 7 configuration docs, or this article on enabling it on WordPress sites.
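On Apache, for instance, the relevant mod_deflate directive looks roughly like this (a sketch; confirm the exact syntax and MIME types for your Apache version with your server admin):

```apache
# Compress text resources; images are left alone since they are already compressed
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
```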

In addition, images can often contribute 80% or more of your total page download size, so it's very important to optimize them as well. Follow these best practices to cut your image size by 50% or more in some cases:

  • Don't use PNG images for photos. JPEG images compress photographs to significantly smaller sizes with great image quality. For example, on Windows 8 launch day, the Microsoft homepage used a 1 megabyte PNG photograph when a visually comparable JPEG would have been 140k! Think of all the wasted bandwidth on that one image alone!
  • Don't overuse PNGs for transparency. Transparency is a great effect (and not supported by JPEG), but if you don't need it, don't pay the extra file size of a PNG, especially for photographic images. PNGs work better for logos and images with sharp contrast, like text.
  • Correctly set your JPEG image quality. Using a quality setting of 50-75% can significantly reduce the size of your image without noticeable impact on image quality. Of course, each result should be evaluated individually. In most cases your images should each be less than 100k, and preferably smaller.
  • Strip out extraneous metadata from your images. Image editors leave a lot of "junk" in your image files, including thumbnails, comments, unused palette entries and more. While these are useful to the designer, they don't need to be downloaded by your users. Instead, have your designer make a backup copy for their own use, and then run the website image versions through a free optimizer like Yahoo's Smush.It or open source tools like pngcrush and jpegtran.

Lastly, another good way to reduce your page size is to minify your JavaScript and CSS. "Minification" is a process that strips out the extra comments and whitespace in your code, and can also shorten the names of functions and variables. This is best seen by example:

Example: Original JavaScript

 /* ALERT PLUGIN DEFINITION
  * ======================= */
  var old = $.fn.alert
  $.fn.alert = function (option) {
    return this.each(function () {
      var $this = $(this)
        , data = $this.data('alert')
      if (!data) $this.data('alert', (data = new Alert(this)))
      if (typeof option == 'string') data[option].call($this)
    })
  }
  $.fn.alert.Constructor = Alert

Minified Version (from YUI Compressor):

var old=$.fn.alert;$.fn.alert=function(a){return this.each(function(){var c=$(this),b=c.data("alert");if(!b){c.data("alert",(b=new Alert(this)))}if(typeof a=="string"){b[a].call(c)}})};

Your minified pages will still render the same, and this can often reduce file sizes by 10-20% or more. As you can see, it also has the added benefit of obfuscating your code, making it harder for competitors to copy and modify your hard-earned work for their own purposes. JSCompress is a simple online tool for JavaScript, or you can try more powerful tools like JSMin or Yahoo's YUI Compressor (which also works for CSS). There's also a useful online version of YUI Compressor, which we recommend.
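To see what minification does mechanically, here's a deliberately naive Python sketch that strips /* ... */ block comments and collapses whitespace. Real minifiers like YUI Compressor actually parse the code (and are safe around strings and regex literals, which this toy is not), so use those in practice:

```python
import re

def naive_minify(source):
    """A toy minifier: remove block comments and collapse whitespace.

    Real tools (YUI Compressor, JSMin) understand the language grammar;
    this sketch only illustrates the idea.
    """
    # Strip /* ... */ block comments (non-greedy, across newlines).
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)
    # Collapse runs of whitespace (including newlines) to a single space.
    source = re.sub(r"\s+", " ", source)
    return source.strip()

code = """
/* ALERT PLUGIN DEFINITION
 * ======================= */
var old = $.fn.alert;
"""
print(naive_minify(code))  # → var old = $.fn.alert;
```

The savings come from the same two sources the real tools exploit: dead bytes (comments, indentation) and, in the full tools, shorter identifiers.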

Step 2: Reduce the number of browser requests

The more resources your browser must request to render your page, the longer it will take to load. A great strategy to reduce your page load time is simply to cut down the number of requests your page makes. This means fewer images, fewer JavaScript files, fewer analytics beacons, etc. There's a reason Google's homepage is so spartan: the clean interface has very few dependencies and thus loads super fast.

While "less is more" should be the goal, we realize this is not always possible, so here are some additional strategies you can employ:

  • Allow browser caching. If your page dependencies don't change often, there's no reason the browser should download them again and again. Talk to your server admin to make sure caching is turned on for your images, JS, and CSS. A quick test is to plug the URL of one of your images into redbot.org and look for an Expires or Cache-Control: max-age header in the result. For example, this image off the eBay home page will be cached by your browser for 28,180,559 seconds (about 326 days).

[Image: expires_header2]

Cache-Control is the newer mechanism, but oftentimes you'll also see Expires to support older browsers. If you see both, Cache-Control "wins" in newer browsers.
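That precedence rule is simple enough to sketch in Python. The header values below are hypothetical, and this is a simplified model (real browsers also handle no-cache, s-maxage, the Age header, heuristic freshness, and more):

```python
import re

def freshness_lifetime(headers):
    """Return the cache lifetime in seconds, preferring Cache-Control.

    Simplified sketch: when Cache-Control: max-age is present it wins;
    otherwise we'd fall back to Expires (computing a lifetime from the
    Expires date requires the response's Date header, so this sketch
    just returns None for that case).
    """
    cache_control = headers.get("Cache-Control", "")
    match = re.search(r"max-age=(\d+)", cache_control)
    if match:
        return int(match.group(1))  # Cache-Control wins over Expires
    return None

# Hypothetical response headers with both mechanisms present:
headers = {"Cache-Control": "public, max-age=28180559",
           "Expires": "Thu, 15 Apr 2010 20:00:00 GMT"}
print(freshness_lifetime(headers))  # → 28180559
```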

While browser-side caching will not speed up the initial page load of your site, it will make a HUGE difference on repeat views, often knocking off 70% or more of the load time. You can see this clearly in the "Repeat View" metrics of a WebPageTest run, for example:

[Image: browser_caching]

  • Combine related CSS and JS files. While numerous individual CSS and JS files are easier for your developers to maintain, fewer files load much faster in the browser. If your files change infrequently, a one-time concatenation is an easy win. If they change frequently, consider adding a step to your deploy process that automatically concatenates related groups of files prior to deployment, grouping by functional area. There are pros and cons to each approach, and there's some great info in this StackOverflow thread.
  • Combine small images into CSS sprites. If your site has lots of small images (buttons, icons, etc.), you can realize significant performance gains by combining them all into a single image file called a "sprite." Sprites are more challenging to implement, but can yield significant performance gains for visually rich sites. See the CSS Image Sprites article on w3schools for more information, and check out the free tool SpriteMe.
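The "combine related files" step above can be as simple as this Python sketch; the file names are hypothetical, and a real build step would typically also minify the combined output:

```python
from pathlib import Path

def concatenate(sources, destination):
    """Concatenate related JS (or CSS) files into one, separated by newlines."""
    combined = "\n".join(Path(name).read_text() for name in sources)
    Path(destination).write_text(combined)

# Hypothetical file names; run this as part of your deploy process:
# concatenate(["jquery.plugins.js", "site.widgets.js"], "site.combined.js")
```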

Step 3: Reduce the distance to your site

If your website is hosted in Virginia, but your users are visiting from Australia, it's going to take them a long time to download your images, JavaScript and CSS. This can be a big problem if your site is content-heavy and you get a lot of traffic from users far away. Fortunately, there's an easy answer: Sign up for a Content Delivery Network (CDN). There are many excellent ones out there now, including Akamai, Amazon CloudFront, CloudFlare and more.

CDNs basically work like this: you change the URL of your images, JS, and CSS from something like this:

https://mysite.com/myimage.png

to something like this (as per the instructions given to you from your CDN provider):

https://d34vewdf5sdfsdfs.cloudfront.net/myimage.png

This instructs the browser to look out on the CDN network for your image. The CDN provider returns that image to the browser if it has it; if not, it pulls the image from your site and stores it for reuse later. The magic of CDNs is that they then copy that same image (or JavaScript or CSS file) to dozens, hundreds, or even thousands of "edge nodes" around the world, routing each browser request to the closest available location. So if you're in Melbourne and request an image hosted in Virginia, you may instead get a copy from Sydney. Just like magic.

To illustrate, consider the left image (centralized server) vs. the right image (duplicated content around the world):

In closing

While front-end performance does not currently appear to have a direct impact on search ranking, it has a clear impact on user engagement and conversions into paying customers. Since page load time also has a direct impact on user experience, it is very likely to have a future impact on search ranking.

While there are many ways to optimize your site, we suggest keeping three core principles in mind:

  1. Reduce the size of your page
  2. Reduce the number of browser requests
  3. Reduce the distance to your site

Within each of these, different strategies apply based on the makeup of your site. We at Zoompf offer several free tools to help you determine which areas will make the biggest impact, including a free analysis of your website for over 400 common causes of slow front-end performance. You can find them here: https://zoompf.com/free.

Happy hunting!