During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.
The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.
Previously, it was not clear how Google evaluated site speed, and it was generally believed to be measured by Googlebot during its visits (a belief reinforced by the presence of speed charts in Search Console). However, the advent of JavaScript-enabled crawling made it even less clear what Google is doing; they obviously want the most realistic data possible, but that is a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was).
In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.
Google Search Console
Firstly, we should clarify our understanding of what the "time spent downloading a page" metric in Google Search Console is telling us. Most of us will recognize graphs like this one:
Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):
John clarified what this graph is showing:
It's technically not "downloading the page" but rather "receiving data in response to requesting a URL" - it's not based on rendering the page, it includes all requests made.
And that it is:
this is the average over all requests for that day
Because Google may be fetching a very different set of resources every day when it's crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.
For that reason, John points out that:
Focusing blindly on that number doesn't make sense.
With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).
Okay, now that we understand that graph and what it represents, let’s look at the next option: the Google WRS.
Googlebot & the Web Rendering Service
Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for features like "Fetch as Google" in Search Console, and is increasingly what Googlebot uses when it crawls pages.
However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:
At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.
Chrome User Experience Report
Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”
Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.
In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.
However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they're using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).
At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:
The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you're into that sort of thing!
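If you want to poke at that dataset yourself, a query along these lines pulls the First Contentful Paint histogram for a single origin. This is only a sketch: the `chrome-ux-report.all.YYYYMM` table naming pattern is real, but the month and the origin below are placeholders you would substitute with your own; you would run the query in the BigQuery console or via the `google-cloud-bigquery` client.

```python
# Sketch of a BigQuery query against the public Chrome UX Report dataset.
# The origin and table month here are placeholder values, not real data.
ORIGIN = "https://example.com"          # substitute the site you care about
TABLE = "chrome-ux-report.all.201712"   # one month of the public dataset

query = f"""
SELECT
  bin.start AS fcp_ms,
  SUM(bin.density) AS density
FROM
  `{TABLE}`,
  UNNEST(first_contentful_paint.histogram.bin) AS bin
WHERE
  origin = '{ORIGIN}'
GROUP BY
  fcp_ms
ORDER BY
  fcp_ms
"""

print(query)
```

Each row of the result gives the start of an FCP time bucket (in milliseconds) and the fraction of real-user page loads that fell into it.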
We can’t be sure what other factors Google is using, but we now know they are certainly using this data. As I mentioned above, I also imagine they have data on more sites than are provided in the public dataset, though this is not confirmed.
Pay attention to users
Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.
The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.
Essentially, this means that there's no longer a reason to worry about page speed for Googlebot; you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.
If you are unsure where to look for site speed advice, then you should look at:
- How fast is fast enough? Next-gen performance optimization - the 2018 edition by Bastian Grimm
- Site Speed for Digital Marketers by Mat Clayton
That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!
Because of limited time and too many priorities, I find it's helpful to use other site speed tools such as GTmetrix or Pingdom first because they provide more detail, then after making improvements go to PageSpeed to see if the numbers improve.
Early on, I discovered another important use of PageSpeed that had nothing to do with site speed: it rendered my site differently on mobile than my own phone and other tools did. So I had to make changes to conform to Google's own standards to get the score back up.
I share your vision. PageSpeed is the baseline, but for important optimizations you have to use other tools, because on its own it is clearly insufficient.
I think it's good for pages that are already well optimized, and for improving those that aren't yet fast. In Search Console we see under 1,000 milliseconds, but the Insights score is not so good, so we'll have to work on that. Apart from Google's Insights tool, I also recommend another Google tool that shows you your speed, how many users you lose because of it, and the speed of other websites in your sector: https://testmysite.withgoogle.com/intl/en-gb
Interesting post Tom. Thank you so much for the information
Prestashop is doing a good job with ecommerce. Your site has fantastic speed; congratulations, Jonathan!
Thank you so much estentor!!
Awesome post TA!
Is there an API that might be easy to integrate into audits or cron jobs to keep an eye on things? Also, which speed metrics are you looking most closely at?
Thought this was neat & further confirms your idea of focusing on the users: User-centric Performance Metrics
Thanks!
Hey Britney! Thanks! :)
Perhaps the more interesting thing, rather than which metrics I'm interested in, is which metrics Google is recording. We're able to see those inside the Chrome User Experience Report BigQuery project. Of those, I think the one I see most people not talking about is First Contentful Paint, which Bastian discusses in the deck that I linked to. Bastian (rightly) points out that First Meaningful Paint is even more important, but it is pretty difficult to measure that in an automated fashion.
In the same presentation, Bastian outlines a method of recording this data into your GA if you want to keep an eye on things. I will admit that I haven't done that and instead I like the 'old school' approach of regularly browsing and spot checking sites I'm looking after.
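For anyone spot checking CrUX numbers, it can help to see how the histogram format turns into a headline figure. Here is a small sketch of summing densities to estimate the share of "fast" page loads; the bins below are invented sample data, but real CrUX rows have the same (start, end, density) shape with densities summing to roughly 1.0 across the whole histogram.

```python
# Sketch: estimating the share of "fast" loads from a CrUX-style FCP
# histogram. The sample bins are made up for illustration; real bins
# come from the Chrome User Experience Report dataset.

def fast_share(bins, threshold_ms=1000):
    """Fraction of loads whose FCP bucket ends at or below threshold_ms."""
    return sum(density for start, end, density in bins if end <= threshold_ms)

# (start_ms, end_ms, density) -- densities sum to 1.0
sample_bins = [
    (0, 500, 0.25),
    (500, 1000, 0.35),
    (1000, 2000, 0.25),
    (2000, 4000, 0.15),
]

print(round(fast_share(sample_bins), 2))  # 0.25 + 0.35 -> 0.6
```

The same one-liner works for any of the CrUX metrics, since they all share the histogram-of-densities format.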
Thanks for sharing the link -- it looks like they are also discussing First Meaningful Paint. I wonder how long it will be until Google attempts to use machine learning to determine what constitutes a Meaningful Paint, so that they can measure it automatically?
This does not surprise me at all. I'd even go as far as to say that I expect Google to use Chrome data to measure many other usage statistics and feed those back to the search algorithm.
For example, when users don't click anything on your site, Chrome knows that. It would be naive to assume that these data points would be ignored when ranking websites.
Hi Tom, Thanks for sharing such an awesome post.
I have a query: for many of the websites I work on, Google PageSpeed Insights shows "Unavailable" in the speed column. Can you please guide me on what the reason could be, or how I can solve the issue? Optimization shows as "Good" for the website, but the speed analysis is unavailable.
Thanks once again for the informative post; I learned several new things.
Hi Akash! :)
I believe that is because for PageSpeed Insights that speed data is now using the public Chrome User Experience Report data. However, that certainly does not cover all websites and so that is likely why you are seeing that. It isn't a reason to worry if everything else is looking healthy. It does mean, however, that you will have to use other methods to measure site speed.
Hi Tom,
Thanks for your response, as suggested I will be using other methods to measure site speed.
In PageSpeed Insights our website scores 79/100, it can be improved but at the moment we are happy.
Thank you very much for this post, Tom, it is very complete and interesting.
Best Regards
Hello Estentor, do you have any idea what the average PageSpeed score of your competitors is? Is there any tool that helps us gather those faster, or do you have to research them one by one on the PageSpeed Insights page?
Hi Tom!
I always use all the data points available, but sometimes I get very different numbers comparing PageSpeed Insights, the Mobile Website Speed Test (https://testmysite.withgoogle.com) and other sources. Not sure what they consider in this test besides simulating a 3G connection and a Moto G4 smartphone.
Just looking at a specific URL now, and while PageSpeed Insights gives me a great 0.8s FCP + 1.5s DCL for mobile, the mobile speed test says the page takes 8-9 seconds to load.
From your post it seems we should pay much more attention to the "new" metrics in PageSpeed Insights (FCP and DCL), which are used as a speed reference for SEO and are more granular. Does that make sense?
Hey Gustavo!
Yes - what you are saying makes sense to me, and that is probably the direction I would lean in. However, in my chats with John Mueller, he was eager to stress that they still use a variety of metrics for gauging speed. Therefore, we need to make sure that we don't get fixated on this new metric, but continue to pay attention to all the data points.
In your case I would want to understand why there is such a discrepancy between the different data sources (there may be a good reason, but we should make sure we understand it); then it's possible to make an informed decision about what may need doing (if anything).
It seems to me that measuring on websites with a high number of visits introduces greater realism, but on websites with little traffic it can bias the statistical sample. With a lower daily range of visits, issues with a particular browser or device can influence the measurement data in a meaningful way. And has nobody considered that, since this measurement happens outside Google's control, users could harm their competitors' measured load times, for example by deliberately producing slow page loads?
All of this applies mainly to projects with a low volume of traffic ;)
PS! Don't forget crawl budgets.
Thanks for sharing this Tom.
Awesome news and I agree that it is a positive move.
Hi, Tom
Thanks for sharing a nice and knowledgeable post. The main point for me is how to optimize a website with Google Search Console. Search Console is very helpful for crawling and optimizing a website in a few easy steps: it shows you the website structure and keyword positions, and if any page has not been crawled by Google, you can request crawling of that page yourself. As for website speed, there are optimization tools that compress images, CSS, and JavaScript; they are widely available on the internet, and after using them, page speed will improve.
When dealing with Website Speed you need to look at "Waterfalls".
I really get annoyed seeing so many SEOs talk about website speed and either talk rubbish or be completely useless on the subject. I'm not saying you are part of that. But quoting limited tools like Google PageSpeed or Google Webmaster Tools and thinking that will really help you fix website loading speeds is wrong in so many ways!
If you are serious about wanting to know about Speed then you must go and learn some basic things like:
R.A.I.L. (Response Animation Idle Load) and R.U.M. (Real User Monitoring) for a starting point.
Then new things like the Paint Timing API.
Use proper tools like:
webpagetest
lighthouse
devtools
Also, if you are really serious about speed and performance, then throw rubbish CMSes like WordPress in the bin, move over to modern frameworks and libraries, and use PWAs.
With PWAs you will understand the basic concept of the "App Shell" and see how we achieve a first load in milliseconds and near-instant repeat loads using service workers, etc.
Your comment has some useful information. For the record, some of us work in small or one-person shops. We may not have your level of technical expertise to do what you describe. So I hope we agree that suggestions depend in part on who needs the help with site speed.
Hi Scott, thanks for the reply. I tell you what: if Moz allows it, I will write a technical post about speed. I could explain it to a 3-year-old and make them understand how to do technical SEO and get high-quality results. Not saying you are a 3-year-old, but I know I can show you how to do it quite easily. My only requirement would be for Moz to update their website to allow code examples, so I would suggest Moz install Prism.js.
Hi, Sally. I'd like to see such a post. Time is precious for me because I'm a lone wolf operation, so I'm always on the lookout for any improvements I can make that are simple and quick.
There it is! I was always sure that Google must use user-driven data to judge a website's speed. Anything else wouldn't make sense.
This is a very critical piece of information. Now hosting and on-page optimization, especially on the home page but also on all images and plugins on product pages, is more critical than ever to help us rank higher!
Amazing post Tom, but I have a question:
How do I check data usage in Chrome?
Regards,
Great article, congratulations.
This confirms what many of us think: Google uses all the information at its fingertips. In fact, broadly speaking, the information it collects is its greatest asset.
It is a very good article. Though I work on my own and don't have much experience, I must say that measuring speed from different data sources showed me that the results vary. So which of them makes sense?
Thanks for sharing such important info and that is good news for the sites that optimize according to google!
Thanks once again for the informative post Tom.
Hi Tom,
Thanks for sharing. A nice and very informative article. Does page speed affect the ranking of a website?
Yes, it does. We completed first-quarter audits for all of our SEO clients, and mobile responsiveness was #1 with page loading speed at #2. But in reality, loading speed is #1 because of mobile-first indexing. If your site doesn't display fast enough on mobile, you're going to take a big hit in rankings. So be sure to check your website in mobile speed-testing software.
Thanks for explaining this, Tom. No big surprise at all though. Bottom line: there is no such thing as a free lunch, right?