As I'm sure many of you know, page speed has become a ranking factor in the search engines. For the engines it's not a make-or-break factor, but I've always felt that it is for visitors.
With a huge (and growing) number of people using super-fast broadband, patience for slow-loading websites is beginning to fray. I personally hate waiting for a page to load and I'm sure my visitors do as well.
That's why if you're investing lots of time and money into getting the rankings you deserve, you shouldn't be blowing it by having a slow website.
That's why I want to share some practical advice I've learnt from optimising my websites' loading times. If you don't have a web development background, don't worry: I'll keep everything clear (and if anything gets too technical, feel free to send it to your website team). So let's get started.
Tools to Use
As every SEO knows, tools make our lives easier, and this situation is no different. The easiest tools I've found are the YSlow and Page Speed plugins for Firefox/Firebug.
- You can download Firebug via - https://addons.mozilla.org/en-US/firefox/addon/firebug/
- You can download YSlow via - https://addons.mozilla.org/en-US/firefox/addon/YSlow/
- You can download Page Speed via - https://code.google.com/speed/page-speed/download.html
Once you have these installed you're good to go. In Firefox, go to the website you want to optimise. Click the Firebug icon in your add-on bar to open the Firebug console, then click the Page Speed tab and then Analyse Performance. In a few seconds you'll get a score out of 100. The aim: get as close to 100 as possible. Most likely you'll already be around the 70-80 mark.
Using these tools I've taken one of my wall clock sites to a good 90/100 and much faster loading times.
The handy thing about the Page Speed plugin is that it gives you additional information if you click into each of the sections, and each section is colour-coded to show how good or bad it is: red bad, yellow OK, green great. Easy enough.
What You Can Do
Below are some of the most common things I see people fail to optimise. I'll go through each one and explain how you can fix it.
Minify CSS
CSS (cascading style sheets, the files that make your website pretty) is commonly overlooked. For any reasonable site, CSS files can run from a couple of hundred to tens of thousands of lines of code. Each tab, space, extra comma, line break and code comment adds to your file size. Although they make life easier for the people working in the code, they can slow your website down.
The solution is to keep an original copy for the developers to work in, then have them minify (remove all the unnecessary bits from) the code for the live website. You can expect a 20-30% saving on average, which on large files can help a lot!
Page Speed has a built-in compressor, but you can also use a tool like CSS Minifier. On its maximum compression setting you can achieve much higher savings.
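To make this concrete, here's a small illustrative example (the selector and values are invented for this sketch, not taken from any particular site). The meaning is identical; only the bytes change:

```css
/* Readable, developer-friendly source: */
.header {
    color: #ffffff;
    background-color: #336699;  /* brand blue */
    margin: 0 auto;
}

/* The same rule after minification -- identical meaning, far fewer bytes: */
.header{color:#fff;background-color:#369;margin:0 auto}
```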
Minify JavaScript
Again, many people forget to compress their JavaScript. All those spaces and line breaks add up, and on large files they waste a lot of bytes. Even more so these days, now that JavaScript libraries such as jQuery and MooTools have become widespread.
Page Speed also has a built-in compressor, with the saving shown alongside.
Minimise File Requests
Each request your website makes to the server for a file slows the loading of the page down. The more requests you make, the slower things will be.
When you (or your team) come to build a website, think about where you can avoid using images for effects and use CSS instead. Gradients, buttons and rounded corners are just a few examples, all of which can be achieved with CSS.
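For instance, a button that might once have needed a sliced-up image can be drawn entirely in CSS. This is a rough sketch; at the time of writing you may also need vendor-prefixed versions of some of these properties for older browsers:

```css
.fancy-button {
  border-radius: 5px;                             /* rounded corners, no corner images */
  background: linear-gradient(#4a90d9, #2b6cb0);  /* gradient fill, no background image */
  box-shadow: 0 1px 3px rgba(0, 0, 0, 0.3);       /* subtle depth, no shadow image */
  color: #fff;
  padding: 8px 16px;
}
```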
This also includes the requests you make for CSS files and JavaScript files. Consider the following:
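(The original screenshot isn't reproduced here; the markup below, with hypothetical file names, illustrates the same pattern.)

```html
<!-- Five separate HTTP requests: -->
<script src="/js/jquery.js"></script>
<script src="/js/slider.js"></script>
<script src="/js/lightbox.js"></script>
<script src="/js/forms.js"></script>
<script src="/js/tracking.js"></script>
```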
You can see that there are many requests for JavaScript files here. All of which can be combined into a single file that is requested once.
We'll talk more about minimizing file requests for images in CSS Sprites.
Leverage Browser Caching
Not only can you minimise files, you can also have the user's browser cache your files (download them once and reuse the local copy) rather than reload them each time. This saves on requests and makes loading quicker for returning visitors.
Check out Google's page on caching - https://code.google.com/speed/page-speed/docs/caching.html
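As a sketch of what this looks like in practice on an Apache server (assuming mod_expires is enabled; adjust the file types and lifetimes to suit your site), you could add something like this to your .htaccess file:

```apache
# Tell browsers to keep a local copy of static files for a while
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

The trade-off is that once a file is cached, returning visitors won't see changes until the cache expires, so long lifetimes suit files that rarely change.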
Minimize Redirects
Redirections take time. The more you have, the longer it takes your user to get to the page you are redirecting them to.
Avoid using them wherever you can.
Optimise Images
This is another commonly overlooked one. If your website has lots of images, or large images, they will take time to load. Knowing which file type to save your image as is half the battle. In Photoshop or Fireworks, the image export wizard will tell you what the file size will be in bytes or kilobytes.
Simple images tend to be smaller in the PNG format, whereas more complex images tend to be better as JPG. There are also varying levels of quality you can export at. Why export at maximum JPG quality if there's no noticeable difference between that and medium?
Going through your pre-existing images may take a while but the savings help. Especially if you have a popular website and are paying for bandwidth.
There is also a saving to be had by exporting images using Fireworks over Photoshop. This article proves the point perfectly (https://webdesignerwall.com/general/fireworks-vs-photoshop-compression).
But just to save you the time, here's the sum up (this was just his background image):
"If I export the background with Fireworks, I can reduce its file size by 20 kb. I get about 16,000 visits per day on average. 20 kb x 16,000 = 320,000 kb. Yes, that is 320 megabytes per day!"
Avoid Bad CSS
CSS is easy to learn but very hard to master. There are some great websites out there for learning how to write better, more efficient CSS that you or your developer should read. I can almost guarantee that everyone has something to learn.
A fantastic site for learning more about CSS is CSS Tricks but you will find hundreds if you do a quick Google search.
Enable Compression
You can also enable gzip or deflate compression on your server. This will reduce the size of the HTML, CSS and other text files being sent to your visitors. Once again, smaller file sizes = quicker loading times.
You can learn more about compression here (https://code.google.com/speed/page-speed/docs/payload.html#GzipCompression)
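For example, on an Apache server with mod_deflate enabled, a minimal .htaccess sketch might look like this (the MIME types shown are just the common text-based ones):

```apache
# Compress text-based responses before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```

There's no point compressing images this way; formats like JPG and PNG are already compressed.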
Use A CDN
Content distribution networks are a fantastic and cheap way to serve media on your website. Rather than having your own server send the images, for example, visitors load them from a CDN such as Amazon's S3 service. This takes the load off your server, allowing it to serve more visitors.
CDNs are more for medium to larger sized websites and can make all the difference if you are serving tens/hundreds of thousands of visitors a day.
Use A Caching System
Almost all websites these days use databases; ecommerce websites and blogs are prime examples. As we discussed before, every request you make to a server slows the loading of a page down, and it's exactly the same with databases. Every time you load a page, the information is requested from the database and then returned to the visitor.
Servers can only handle so many requests per second before they die under the strain of trying to give everyone the information that they have requested. And that's where caching systems come into play.
Rather than request the information every time a person opens a page, a caching system will call the information once every hour (for example) and "save" the results. Each visitor will get the saved version until it updates again. This is in principle the way in which sites like Facebook deal with the millions of requests their databases get every second.
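The pattern looks roughly like this PHP sketch (the query_database() helper and the cache file name are invented for illustration; real plugins like the ones mentioned below do this, and more, for you):

```php
<?php
// Minimal cache-aside sketch: serve a saved copy if it's fresh,
// otherwise hit the database once and save the result.
function get_products_cached($ttl = 3600) {
    $cache_file = sys_get_temp_dir() . '/products.cache';

    // Fresh cached copy? Serve it without touching the database.
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        return unserialize(file_get_contents($cache_file));
    }

    // Otherwise query once and save the result for every later visitor.
    $products = query_database('SELECT * FROM products'); // hypothetical helper
    file_put_contents($cache_file, serialize($products));
    return $products;
}
```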
This tactic is usually only needed for medium to larger websites. However, if you're using pre-built packages such as WordPress or Magento, you can implement caching very easily.
Wordpress
My favourite caching plugin for WordPress is Hyper Cache. It's a five-minute install/configuration and will help you deal with lots of traffic, or the spikes you may get if your content gets featured on sites like Digg. The link is https://wordpress.org/extend/plugins/hyper-cache/
There are also many other caching plugins for WordPress that work in a similar way and are easily found with a quick search.
Ecommerce
Most ecommerce packages come with caching built in. Magento, for example, has a very good cache system that works out of the box. If you have built your own ecommerce software and find it loads very slowly, consider implementing a caching system with something like memcached.
Use CSS Sprites
This is perhaps my favourite little trick of all. CSS sprites are a fantastic way to not only cut down on the number of image requests that you make but also to reduce the overall size of the images.
The concept is simple and elegant: use one image as a "template", then display only the small section of it you need.
A CSS sprite is a single image that acts as a template for many of the buttons, nav headings, icons and so on across a website. By using a sprite, the number of requests drops and so does the overall file size.
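The CSS side of a sprite is just a shared background image plus a different background-position per element. A minimal sketch (the file name and offsets are hypothetical; they'd match wherever each icon sits in your sprite image):

```css
/* One image holds every icon; each element shows only its own slice. */
.icon {
  background-image: url('sprite.png');
  background-repeat: no-repeat;
  display: inline-block;
  width: 16px;
  height: 16px;
}
.icon-home   { background-position: 0 0; }      /* top-left 16x16 slice */
.icon-search { background-position: -16px 0; }  /* next slice along */
.icon-rss    { background-position: -32px 0; }
```

One request fetches sprite.png, and every icon on the page reuses it.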
Pretty cool huh?
Write Good Code
This one may seem obvious, but writing clean code that doesn't unnecessarily repeat itself can make your website much quicker. If you run custom software, it may well be worth moving to a pre-built solution, or having the system rebuilt, if none of the above speeds up your site.
Poorly optimised backend code will slow a site down no matter how good your frontend efforts are.
If That Isn't Enough
So, you have done all of the above and your site still loads slowly. What else can you do?
Check Your Hardware
If you are trying to run an ecommerce store with 20,000 products on a single shared host, then this may well be your problem. Your server hardware and configuration may be the cause of your slow speeds.
It may not be able to keep up with what you're throwing at it day to day. Upgrading to a dedicated or more powerful machine may well be the solution. Another option is to buy more servers and split up the load, e.g. one server for files, one for the database, and so on.
Update Your Software (e.g. WordPress)
If you are using pre-built software such as WordPress, you may just need to update. Quite often the developers will have written more efficient code and solved a lot of problems; if you're running an old version, you won't be benefiting from their improvements.
Please use caution when you do upgrade, though: test the upgrade first on a separate server. Don't just upgrade your live website and hope nothing breaks, because more often than not, it will.
Closing Thoughts
I hope this has given you a taster of how to optimise your website's loading speed. This isn't a comprehensive guide, so I'm sure there are things that haven't been included that could also help you.
My advice is to simply work down the list in Page Speed and try to get as much as possible right. You'll never hit 100/100 (believe me, I've tried; my closest was 98/100) but it will certainly give you a path to follow.
For those who have paid attention this long, you may wonder why I haven't talked about YSlow. Both YSlow and Page Speed perform a similar task, and both offer extra little insights if you dig into their tabs. YSlow also shows the loading time in the add-on bar, which is handy while you're trying to improve your speed.
I wish you all the best in your endeavours!
What Works For You?
I'd also love to hear what has or hasn't worked for you in getting your loading times down. It would be great if we could get different insights from everyone else!
Great post Ed! Thanks for sharing all your insight. Even at SEOmoz, we are working with exactly these same challenges. As website pages grow increasingly complex the importance of page speed optimization grows even greater. Some folks downplay the importance of page speed in Google rankings - and they are probably right - but the damage done to user experience and conversion rates by slow loading websites can't be ignored.
Thanks very much! I agree completely, slow sites usually lead to bad conversion rates. I've experienced it myself.
My eCommerce software of choice tends to be slow out of the box (one of the reasons I learnt about optimising page speeds) meaning my bounce rates were huge and conversions were low. After optimising and moving to a slightly better server, things turned around. So I'm convinced that speed is important and spending some time to fix any niggling issues is time well spent.
I've had good results with the WordPress plugin W3 Total Cache as well.
For a page speed tool, I'd highly recommend Pingdom
It's a fantastic free tool that not only gives you the normal load times and waterfalls, but also allows you to save your tests for future comparisons which is handy.
Pingdom has also proved to be very accurate, compared to the silly results given by the Google Speed Test tool, for example.
Pingdom is great, I second that.
Nice breakdown on what often end up being pretty simple fixes. I'm shocked how often I still see sites with 200KB JPEGs that could easily be 20-40KB with almost no loss of quality, or GIFs/PNGs that should be JPEGs (and vice versa).
I love nothing more than a well detailed blog post. Thanks for sharing these tips @edbaxter
I must say, since Panda I've taken even the smallest things and made a big effort to resolve them, but something I've never skimped on is site speed. As you said, from an end-user point of view a slow site is awful, but even before Panda and Google taking site speed seriously, I've seen better ranking results for sites once they've been optimized for speed.
It's great to see that Ed's post made it to the main blog. This is how good YouMoz posts should be.
Great work and congrats Ed.
Loving the CSS sprites - how cool is that.
Another thing that often gets underestimated is the reduction of server-side code. Think about any WordPress theme, for example: there are a lot of PHP functions generating the same output year after year. All that PHP code is executed every time, when it could simply be replaced with the final HTML output.
Think about pieces of code like these:
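The commenter's original snippet hasn't survived, but a typical example of the kind of theme code being described (using real WordPress template tags; the URLs and names in the static version are placeholders) is:

```php
<!-- Theme template code: each tag below is a PHP call executed on every page view -->
<link rel="stylesheet" href="<?php bloginfo('stylesheet_url'); ?>" />
<title><?php bloginfo('name'); ?></title>

<!-- The output never changes, so it can be pasted in as static HTML instead: -->
<link rel="stylesheet" href="https://example.com/wp-content/themes/mytheme/style.css" />
<title>My Blog</title>
```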
All this code can be replaced with the final HTML it produces, saving a lot of PHP calls and leading to a huge improvement in server performance.
Some time ago I had to deal with a WordPress website averaging 20,000 visits and 150,000 pageviews per day. The (dedicated) server was struggling to handle all the traffic, but simply removing unnecessary PHP code totally changed the situation.
Page speed is definitely one of the factors Google takes into account when ranking websites. The reason is user experience: if users have to wait for a website to load, it's a problem.
A great post.
I think that this post deserves to be promoted to the main blog.
It's a small factor as far as ranking goes, but it will certainly improve overall performance and you'll get more from your visitors/users. There is a good Webmaster Tools video on this, and independent tests have shown that taking care of the speed/loading factor will improve the bounce rate.
We've gotten so used to coding for high-speed users; who would have thought the search engines would start looking into this? I often find that CSS sprites are the easiest way to control speed when loading websites. It's a little tricky to figure out at first, but once you get the hang of it, it goes pretty easily.
My compliments, really handy! Just one thing: consider creating a one-page checklist of all this information. That would make it even more practical.
I've found that refreshing the Page Speed plugin while working is the best checklist. Each time you make a change to your website it lets you know how you are doing, and you can simply work down the list.
Good timing with this post. Today Google Analytics released a way to track page load time. All you have to do is add a line of code:
https://analytics.blogspot.com/2011/05/measure-page-load-time-with-site-speed.html
Rather than have mod_deflate gzip CSS or JS files for each user, I manually zip them up using 7-Zip and serve them with:
<FilesMatch .*\.js.gz$>
ForceType application/javascript
Header set Content-Encoding: gzip
</FilesMatch>
<FilesMatch .*\.css.gz$>
ForceType text/css
Header set Content-Encoding: gzip
</FilesMatch>
RewriteCond %{HTTP:Accept-encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !Safari
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [QSA,L]
Sprites are good for single-page elements, but for site-wide images use data URIs (more work but worth it); that way you can place the generated code within your CSS and have it cached along with the stylesheet.
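For anyone unfamiliar with the technique: a data URI embeds the image bytes, base64-encoded, directly in the stylesheet, so no separate request is made and the data is cached with the CSS. A sketch (the file name and truncated base64 string are placeholders):

```css
/* A separate HTTP request for a tiny icon: */
.bullet { background-image: url('dot.png'); }

/* The same icon embedded as a data URI -- no extra request: */
.bullet { background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...'); }
```

This only pays off for small images; large ones bloat the stylesheet and lose the benefit.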
Software I use
https://pnggauntlet.com/
Has anyone found a good way to pull the Google Analytics Site Speed data into an excel spreadsheet and use this to track the speeds of a multitude of sites that you own?
Great article. As an SEO consultant, it's really hard to get this work done by web designers. I started recommending page speed to all my clients a couple of years ago. This article will make things simpler for web designers (hopefully); I am sending the link to all the web designers who work with us.
Thanks again.
Very well written and easy to follow article.
Let me add that we are experimenting with CSS gradients in place of background images and have seen drastically reduced loading times and fewer HTTP requests. We are using a free code generator tool located at: https://www.colorzilla.com/gradient-editor/
In addition we use mod_pagespeed integrated web hosting at: https://www.mod-page-speed.com
They are strict about user subscriptions (no WP or resource-intensive apps); in the end, it benefits all of their web hosting clients.
Thanks for the superb post! I was looking for such a post and will now act immediately as per things covered in the post.
Thanks for the insight.
This would make a great presentation at a conference! I love walking out of a room with key actions to take, tools to research and testing to be done. Thanks for sharing.
Useful stuff you have shared here. Optimising speed for a website is an important skill to develop, particularly when you are coding clients' websites. Very technical and yet so simple.
Thanks for the links. I have heard about YSlow recently but hadn't gotten around to figuring it out. Your screenshot of Page Speed sold me, though. It tells me what % of CSS wasn't used on that page?! It tells me I can optimize images and even optimizes images for me and lets me save a copy? Now that is FANTASTIC.
YSlow is an interesting one. Some of its recommendations are extremely useful. Others not so much. For example, one of the points it makes is that Javascript should be at the bottom of any code, so that it loads after the page has been seen. That's not always a good solution to other problems, though. I think that as with anything you need to use common sense to work out whether the tool is telling you something useful about a single aspect, or holistically.
Interestingly, SEOMoz blog posts only get a "C" grade, partly because they use a fair amount of Javascript in the head rather than at the bottom of the body.
Great Post! You gave quality information and resources. All of which will keep me busy for days getting things right... and that is a good thing.
So many points covered in one post. Amazing work. I will share it with my developers. After my team reads this, I am sure they will be able to further reduce our website's page load time.
Speed is critical. Everyone likes things fast; why wait? Speed optimizing any website will give it an extra edge over its competition and a boost in rankings. I always implement speed optimization for all my clients, regardless of whether they pay for it or not. Speed optimizing a website should be essential, but not all web developers implement the practice.
About using sprites: I DON'T AGREE!
It actually might hurt you!
I ran some research recently on whether CSS sprites have an effect on SEO. I found that they actually have a **bad** influence, especially if you use mod_pagespeed, which Google is pushing now.
Basically, using sprites can slow your pages down, because it prevents mod_pagespeed from inlining smaller images, and as one of the guys said earlier, it doesn't make sense to sprite large images.
So you can still keep all your alt attributes, captions, descriptions etc., while actually speeding up page loading times.
Here is link to full research:
CSS Sprites in a mod_pagespeed environment
I know this is a five-year-old post, yet it's still very relevant in 2016. I do have a question and hope that someone can help. I have mod_pagespeed installed on my Apache server and it does a wonderful job in many ways, specifically image optimization. Now, from reading up on metadata and the way PageSpeed handles it, part of its compression relies on stripping all metadata. I never needed to pay attention to this, but I've recently found myself wanting to classify my images using the built-in meta properties. Knowing that PageSpeed will strip these, I found a suggested solution: adding
data-pagespeed-no-transform
in the HTML when requesting the image. Doing so didn't yield any results: no metadata. Can someone chime in on this? I can't find anything about this feature and don't know if there's some setting that needs to be done on the server side.
Very useful content; thank you for clarifying the errors and the recommended fixes.
Under the databases section, you might want to get a little more specific for those who aren't familiar. It might make sense to mention things like memory-based caching (memcached), MySQL wire-compatible in-memory alternatives (MemSQL), or general SQL query optimization.
Great post, by the way!
Awesome post - many of the key optimization techniques are highlighted. Google's obsession with speed now includes not only tools but integrated speed assessments in Google Analytics. All that hard work driving traffic to slow pages that don't convert doesn't make sense in the age of broadband. (Disclosure: I work for Yottaa.com.) We offer a free web performance monitoring service that helps you visualize the performance of web pages and recommends ways to improve it. Ten days ago we launched a private beta of our web performance optimization service, Yottaa Optimizer. It's a cloud-based service that can double the speed of your website with just a few clicks - no code changes, software downloads or hardware purchases needed. There are still spots available in the free private beta; check it out and see how fast your website can be! William Toll - Yottaa.com
I like most of the tools that you described in this post, however, sometimes the Google Webmaster Console can do the trick.
Why is SEOmoz so slow? Each tool, and even the site itself. It would be nice to get a member of Moz to respond about why the site is so incredibly slow. It would be different if it were just a non-paying informational portal, but it's not. This isn't a new issue; it's been a persistent issue for years, so why isn't anyone talking about it? It's cool to be community-driven and supportive, but at the same time it would be nice to be given information on how or when this problem is going to be solved.
Nice post. Truly informative.
I use W3 Total Cache for my WP site. Though the plugin docs say it adds "expires" tags etc., the evaluator sites shout about missing expires headers. It is difficult to know whom to believe. I find using a CDN with W3TC deceptive because out of six scripts, only three are on the CDN; why? The combined scripts file (courtesy of W3TC) is supposedly uncompressed too: whose fault is that?
I use two Google fonts, and all I can do is call them. Again, the evaluator says they have short expires times (one day); what can I do if Google won't take corrective measures?
I had some problems with the minify plugins, so I compressed, minified and uploaded the scripts myself. Yet the evaluators still scream "this could have been compressed by xx%". Unrealistic reporting.
Most bloggers use WordPress. If I were to join the JavaScript files, how would I tell the PHP files which file to look for? I think it is time WP did some homework: join these files and edit the calls in the PHP files before putting out a new version.
The Pingdom site is good but deceptive too. Test a page once and it shows an overall loading time of 0.9 seconds; test it a second time and you get 3.7 seconds. PageSpeed gives a rank of 97, YSlow gives a C, and various site loading speed calculators give unbelievable speeds from 1.8 to 11.9 seconds. Farcical, isn't it?
WP-Cycle, a rotator plugin, has no provision for adding alt text to images.
If everyone is concerned about internet bandwidth, some joint effort should be made to conserve it.
Great post Ed. Also thanks for clarifying you are not related to Richard :).
I started optimizing for page speed around 10 months ago, and I have to say I have seen some major improvements just from minifying the external objects.
YSlow and Pingdom are what I have been using, and I definitely recommend them.
Have you tried mod_pagespeed (assuming you're using Apache, that is)?
https://code.google.com/speed/page-speed/docs/module.html
IMHO it is utterly awesome for improving speed and handling issues with minifying JavaScript/CSS etc.
For anyone interested in speeding up their site without having to learn and apply all these best practices by hand, check out https://torbit.com. It's an automated service that can apply all these optimizations (and more) to your site automatically. We're still in beta, but we've already shown that we can double the speed of an average website.
Hey, great info. Loading speed is one of those things that can be really off-putting for visitors, and with the release of the GA tool for easy testing, it gives them a reason to make a bigger deal of it. Loading speed is another good sign of overall site quality, so ignore it at your peril, I feel.
One thing I often see, in the UK at least and with smaller clients especially, is sites hosted on pretty poor hosting packages. Often these have been resold and just don't hold up during peak times, so your loading times can be affected by the myriad other sites hosted on the same machine.
An easy way to check is with a reverse IP lookup; there are simple online tools to do that, like the one here: https://www.yougetsignal.com/tools/web-sites-on-web-server
The only problem is what to do if your hosting package is not up to spec. There are various options: companies that provide high-quality shared hosting with a control panel if you want to keep it simple, or dedicated servers, though those can be a problem if you don't have sysadmin skills.
One of the best approaches we are seeing in terms of reasonable cost, reliability and getting some dedicated resources is cloud hosting. I am in the UK, and Rackspace provide a great hosted cloud server with simple backups and restores, reasonable costs and the ability to scale the solution in moments.
Anyhow, great post. Going to check out the WordPress cache plugin now :)
Marcus
Great post! Congrats! I especially liked the CSS sprite trick; I will definitely start using it as of today. Thanks for sharing!
If you are using WordPress there are some great plugins that can handle a lot of this for you (minifying / compressing / caching etc.). However, I have not come across a WP plugin for browser caching; does anyone know of one?
I think most of them do it. W3 Total Cache ( https://wordpress.org/extend/plugins/w3-total-cache/ ) does it for sure (it's the only plugin I use, so I can't be sure about the others).
Using the Page Speed plugin for Firefox, my pages score in the upper nineties, averaging around 98. But according to Google Webmaster Tools my site is only faster than 67% of sites. It seemed to happen when I moved all of my images to a cookieless domain. I'm now hosting my images on Amazon S3, but Google isn't playing along.
Where is the disconnect? Could it be a problem with the server?
Google Webmaster Tools shows you the time taken to download your page. If it is above 4 seconds, GWT may say your site is slower than 60% of sites.
PageSpeed gives you marks on a scale of 1 to 100.
Hyper cache sounds awesome. I'm giving it a go!
Great breakdown, very easy to understand. I worked through many of these issues; we started with a score of 69 and are now performing at 85. I still have a question: why is my page speed slowing down according to Google Webmaster Tools even though it has improved on all these tools? Any ideas? https://www.tel3.com/
Your suggestions are great. We've had good results over the past several months, moving from the low 80s to the low 90s. My best day last week was having our site go below the evil red line in the Performance Overview graph, meaning we are faster than 80% of sites.
One of the things that helped our page load speed was moving our "trust me" and "secure SSL" type third-party JavaScript-based logo links from being shown on every page to a few select pages. We jumped about 4-5 points in PageSpeed.
The share-everything JavaScript-based social media links also slowed us down. We switched to a few of the most popular image-based hyperlinks (check the ones on SEOmoz, for example) and pages load faster.
Edited to show several paragraphs instead of one gigantic one. Hope it works.
Great post! It inspired me to optimize my home page, and it's now literally 3-10 times faster depending on your internet connection. Before, we had 70 resources (the design was done by somebody a few years ago); now we have 12. On top of this, the size has come down from 350K to 150K. Not too shabby, I'd say! All without actually changing the design.
I did it largely through CSS sprites, what a great technique.
However, the "Leverage Browser Caching" message was cryptic and tough for me to implement, so here's a link to some use-as-is code that I was able to plug in and use: https://cjohansen.no/en/apache/using_a_far_future_expires_header. I'm looking forward to seeing how this affects our visitors and rankings!
Slowcop should definitely be on this list. (www.slowcop.com)
Perfect start to the week with this useful post full of handy tips!
Optimizing your website's load time can be an important factor in its overall potential for success.
https://www.ave-nir.com
Hi, good post. I'm a fan of page speed optimization as a very important parameter for search engine indexation. I'm trying to activate gzip compression on a site but I haven't been able to do it.
I've followed advice from different sources (usually the same tips).
Does anybody know of a good link with step-by-step information on how to activate gzip compression on an Apache server?
Thanks,
1. Be very careful while working on the .htaccess file. I am saying this from experience.
2. I know I am replying really late, but have you managed to set up gzip compression for your website? It took me a few hours to study and understand gzip, and I managed to get it done for both a static site and a WordPress blog. I have written about it here: www.kittuk.com/2011/05/enable-gzip-compression-for-your-website-and-reduce-webpage-load-time/
It's really simple, and if you read about it thoroughly and understand what your website needs, it becomes simpler still.
Nice information about increasing website speed. A lot of websites take too long to load their content; this article will help many people in this area. Thank you.
Excellent post. Another valuable resource is the analyzer on Website Optimization, https://www.websiteoptimization.com/services/analyze/ - it'll show you a list of every HTTP request, whether it's been compressed, and the total loading size of the page you put into its queue. I've had sites go from 250+ HTTP requests down to 85 with little effort, just by following the results from that tool.
https://www.webpagetest.org is also a great tool, I always start and benchmark with it as it gives you an awesome visual breakdown of every request. You also have a number of options of where to test from, and with which browser.
Great post! Great that Google Analytics is now tracking load speeds. I would also recommend https://tools.pingdom.com/ which gives a great visual display of web page element load times.
I agree - this is an awesome tool and very helpful, much better than Google's Page Speed
Ed, do you have a favorite tool or method for creating CSS sprites?
I actually create them by hand, but I know there are a lot of generators out there. I've heard good things about https://spriteme.org/
I use this CSS Sprite Generator:
https://spritegen.website-performance.org/
You upload a Zip containing all the images, and it returns a sprite for you to use with the CSS already set up.
Works flawlessly for me.
Great post. If you ever do a follow-up, can I suggest an entire post on Leveraging Caching. I get that suggestion too on my site and I think the documentation with it is really lacking.
Thanks Sarah, I'll try and write a good article on it in the not too distant future. I agree, the docs out there aren't great.
In my comment above I linked to a blog post that clarified it for me and gave me the .htaccess code to use. What I'm still iffy on is how browsers handle using only the latest version of a file. Usually if you're using far-future caching you need to change your file names to get the browser to fetch the new version, but I haven't had a problem with it so far, and Google does say that my caching is working.
Here's the post: https://cjohansen.no/en/apache/using_a_far_future_expires_header. It's the only one I could find with simple code that works.
BTW, Ed, I don't think I saw you say anything about losslessly optimizing your PNGs. Using Linux, it's extremely easy to download something like OptiPNG and use the command line to shrink your files by 10-30% without compromising any quality. There are a solid six or seven programs that do this easily; I bet some even handle batches.
I look forward to a complete post on leveraging browser caching too. It is important to note, though, that trying to control caching can have very negative effects if you are not careful. For example, many sites (especially ecommerce sites) have content that should not be cached, such as forms and receipts that a visitor generates when going through a checkout. If they try to go back and edit or view those steps again, they will often have difficulties and will usually just leave the site, equalling a loss of sales. I haven't been able to find any descriptive documentation about these issues; the best I can recommend is to test as much as possible, especially for ecommerce sites.
Thanks for the great post. There are lots of posts that share general knowledge, but it's the practical actionable blogs like this that are really powerful. Great suggestions and advice!
Some good points; the speed of my sites and clients' sites is definitely on my radar ;) I like how you have made everything easy to understand, with some great examples. I have sent this post to some people at work =)
Enjoyable read Ed!
Tell me - are we related?!
:-)
Thanks Richard!
Not that I'm aware of, but we both seemingly have SEO in our blood :)
Not all Brits are related, even if you do all look alike to Us Americans. Besides, your last name is "Baxterseo" ;)
Thanks for clearing that up, Pete!
Nice Post.