If you're a webmaster, you probably received one of those infamous “Googlebot cannot access CSS and JS files on example.com” warning letters that Google sent out to seemingly every SEO and webmaster. This was a brand new alert from Google, although we have been hearing from the search engine about the need to ensure all resources are unblocked—including both JavaScript and CSS.
There was definite confusion around these letters, compounded by some of the reporting in Google Search Console. Here's what you need to know about Google's desire to see these resources unblocked, and how you can easily unblock them to take advantage of the associated ranking boosts.
Why does Google care?
One of the biggest complaints about the warning emails was that many felt there was no reason for Google to see these files. This was especially true because Google was flagging files that webmasters have traditionally blocked, such as files within the WordPress admin area and WordPress plugin folders.
Here's the letter in question that many received from Google. It definitely raised plenty of questions and concerns:
Of course, whenever Google does anything that could devalue rankings, the SEO industry tends to freak out. And the confusing message in the warning didn’t help the situation.
Why Google needs it
Google needs to render these files for a couple of key reasons. The most visible and well known is the mobile-friendly algorithm. Google needs to be able to render the page completely, including the JavaScript and CSS, to ensure that the page is mobile-friendly and to apply both the mobile-friendly tag in the search results and the associated ranking boost for mobile search results. Unblocking these resources was one of the things that Google was publicly recommending to webmasters to get the mobile-friendly boost for those pages.
However, other parts of the algorithm rely on rendering these files as well. The page layout algorithm, which looks at where content is placed on the page in relation to the advertisements, is one such example. If Google determines a webpage is mostly ads above the fold, with the actual content below the fold, it can devalue the rankings for those pages. But with the wizardry of CSS, webmasters can easily make the raw HTML suggest the content is front and center, while visually the ads are the most prominent part of the page above the fold.
And while it's an old-school trick and not very effective, people still use CSS and JavaScript to hide things like keyword stuffing and links, including, in the case of a hacked site, hiding it from the actual website owner. By crawling the CSS and JavaScript, Googlebot can determine whether they are being used spammily.
Google also has hundreds of other signals in its search algo, and it is very likely that a few of those use data garnered from CSS and JavaScript in some fashion. And as Google changes things, there is always the possibility that it will use these files for future signals as well.
Why now?
While many SEOs had their first introduction to the perils of blocking JavaScript and CSS when they received the email from Google, Matt Cutts was actually talking about it three-and-a-half years ago in a Google Webmaster Help video.
Then, last year, Google made a significant change to their webmaster guidelines by adding it to their technical guidelines:
Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.
It still got very little attention at the time, especially since most people believed they weren’t blocking anything.
However, one major issue was that some popular WordPress SEO plugins were blocking some JavaScript and CSS. Since most WordPress users weren't aware this was happening, it came as a surprise to learn that they were, in fact, blocking resources.
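For illustration only (a hypothetical example, not pulled from any particular plugin), the kind of blanket rules those older setups wrote into robots.txt looked something like this, which blocks every script and stylesheet served from the core includes and plugin folders:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/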
It also began showing up in a new "Blocked Resources" section of Google Search Console in the month preceding the mobile-friendly algo launch.
How many sites were affected?
In usual Google fashion, they didn't give specific numbers about how many webmasters received these blocked-resources warnings. But Gary Illyes from Google did confirm that they were sent to about 18.7% of the number sent for the mobile-friendly warnings earlier this year:
@jenstar about 18.7% of that sent for mobile issues a few months back
— Gary Illyes (@methode) July 29, 2015
Finding blocked resources
The email that Google sent to webmasters alerting them to the issue of blocked CSS and JavaScript was confusing. It left many webmasters unsure of exactly what was being blocked and what was blocking it, particularly because they were also receiving warnings for JavaScript and CSS hosted on third-party sites.
If you received one of the warning letters, the suggestion for how to find blocked resources was to use the Fetch tool in Google Search Console. While this might be fine for checking the homepage, for sites with more than a handful of pages, this can get tedious quite quickly. Luckily, there's an easier way than Google's suggested method.
There's a full walkthrough here, but for those familiar with Google Search Console, you'll find a section called “Blocked Resources” under “Google Index,” which will tell you which JavaScript and CSS files are blocked and which pages they're found on.
You also should make sure that you check for blocked resources after any major redesign or when launching a new site, as it isn’t entirely clear if Google is still actively sending out these emails to alert webmasters of the problem.
Homepage
There's been some concern about those who use specialized scripts on internal pages and don’t necessarily want to unblock them for security reasons. John Mueller from Google said that they are looking primarily at the homepage—both desktop and mobile—to see what JavaScript and CSS are blocked.
So at least for now, while it is certainly a best practice to unblock CSS and JavaScript from all pages, at the very least you want to make it a priority for the homepage, ensuring nothing on that page is blocked. After that, you can work your way through other pages, paying special attention to pages that have unique JavaScript or CSS.
Indexing of JavaScript & CSS
Another reason many site owners give for not wanting to unblock their CSS and JavaScript is that they don't want those files to be indexed by Google. But neither is a file type that Google will index, according to its long list of file types supported for indexing.
All variations
It is also worth remembering to check both the www and the non-www versions for blocked resources in Google Search Console. This is often overlooked by webmasters who tend to look only at the version they prefer to use for the site.
Also, because the blocked resources data shown in Search Console is based on when Googlebot last crawled each page, you could find additional blocked resources when checking them both. This is especially true for sites that are older or not updated as frequently, and not crawled daily (like a more popular site is).
Likewise, if you have both a mobile version and a desktop version, you'll want to ensure that both are not blocking any resources. It's especially important for the mobile version, since it impacts whether each page gets the mobile-friendly tag and ranking boost in the mobile search results.
And if you serve different pages based on language and location, you'll want to check each of those as well. Don’t just check the “main” version and assume it's all good across the entire site. It's not uncommon to discover surprises in other variations of the same site. At the very least, check the homepage for each language and location.
WordPress and blocking JavaScript & CSS
If you use one of the "SEO for WordPress"-type plugins on a WordPress-based site, chances are you're blocking JavaScript and CSS because of that plugin. Blocking everything in the /wp-admin/ folder used to be one of the “out-of-the-box” default settings for some of them.
When the mobile-friendly algo came into play, because those admin pages were not being individually indexed, the majority of WordPress users left that robots.txt block intact. But this new Google warning does require that all WordPress-related JavaScript and CSS be unblocked, and Google will show an error if you block those files.
Yoast, creator of the popular Yoast SEO plugin (formerly WordPress SEO), also recommends unblocking all the JavaScript and CSS in WordPress, including the /wp-admin/ folder.
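If you would rather not open the admin pages themselves to crawling, one possible middle ground (a sketch only, using the same directory-level pattern Gary Illyes suggests further down, and assuming your pages don't pull anything else from that folder) is to keep the Disallow while explicitly allowing the script and stylesheet files inside it:

User-Agent: Googlebot
Disallow: /wp-admin/
Allow: /wp-admin/*.js
Allow: /wp-admin/*.css

That said, Yoast's simpler advice of removing the block entirely is the less error-prone route.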
Third-party resources
One of the ironies of this was that Google was flagging third-party JavaScript, meaning JavaScript hosted on a third-party site that was called from each webpage. And yes, this includes Google’s own Google AdSense JavaScript.
Initially, Google suggested that website owners contact those third-party sites and ask them to unblock the JavaScript being used, so that Googlebot could crawl it. However, not many webmasters did this; they felt it wasn't their job, especially when they had no control over what a third-party site blocks from crawling.
Google later said that they were not concerned about third-party resources, because of that lack of control webmasters have. So while those files might come up on the blocked resources list, Google is really looking for JavaScript and CSS URLs that the website owner can control through their own robots.txt.
John Mueller revealed more recently that Google was planning to reach out to some of the more frequently cited third-party sites to see if they could unblock the JavaScript. While we don't know which sites they intend to contact, I suspect they'll successfully see some of them unblocked. Again, while this isn't so much a webmaster problem, it'll be nice to have some of those sites no longer flagged in the reports.
How to unblock your JavaScript and CSS
For most users, it's just a case of checking the robots.txt file and ensuring you're allowing all JavaScript and CSS files to be crawled. For Yoast SEO users, you can edit your robots.txt file directly in the WordPress admin area.
Gary Illyes from Google also shared some detailed robots.txt changes on Stack Overflow. You can add these directives to your robots.txt file to allow Googlebot to crawl all JavaScript and CSS.
To be doubly sure you're unblocking all JavaScript and CSS, you can add the following to your robots.txt file, provided you don't have any directories being blocked in it already:
User-Agent: Googlebot
Allow: .js
Allow: .css

If you have a more specialized robots.txt file, where you're blocking entire directories, it can be a bit more complicated.
In these cases, you also need to allow the .js and .css for each of the directories you have blocked.
For example:
User-Agent: Googlebot
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css
Repeat this for each directory you are blocking in robots.txt.
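For instance, a robots.txt blocking two directories (the directory names here are hypothetical) would end up looking something like this:

User-Agent: Googlebot
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css
Disallow: /private/
Allow: /private/*.js
Allow: /private/*.css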
This allows Googlebot to crawl those files, while still disallowing other crawlers (if you've blocked them). However, chances are good that the kinds of bots you're most concerned about crawling your JavaScript and CSS files aren't the ones that honor robots.txt anyway.
You can change the User-Agent to *, which would allow all crawlers to crawl those files. Bing has its own version of the mobile-friendly algo, which requires crawling of JavaScript and CSS, although they haven't sent out warnings about it.
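As a sketch, a wildcard version of Gary's snippet (again assuming you aren't blocking any directories elsewhere in the file) would simply be:

User-Agent: *
Allow: .js
Allow: .css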
Bottom line
If you want to rank as well as you possibly can, unblocking JavaScript and CSS is one of the easiest SEO changes you can make to your site. This is especially important for those with a significant amount of mobile traffic, since the mobile ranking algorithm does require they both be unblocked to get that mobile-friendly ranking boost.
Yes, you can continue blocking Googlebot from crawling either of them, but your rankings will suffer if you do so. And in a world where every position gained counts, it doesn't make sense to sacrifice rankings in order to keep those files private.
Hey Jennifer,
Great article!
Could you please share a real scenario where unblocking JavaScript and CSS resulted in improved rankings?
It is not as simple as "Unblock JS/CSS and move up ten spots!" and it is just one of hundreds of ranking factors. It could be very similar to the HTTPS ranking boost, where we see it make the most difference in super competitive SERPs where every tiny thing counts. I consider it a best practice and probably one of the easiest things you can do to improve rankings, even if Google doesn't tell us exactly how much that impact is.
Actually WordPress may have some issues with generating robots.txt. Since the file isn't present in the filesystem, it's generated by WP itself each time someone requests it. And this can cost CPU time on hosts where time is metered. Even on VPS or dedicated hosts this can cause issues, since the robots.txt can't be cached.
The solution is to create a simple robots.txt in the WP folder with this:
----
User-agent: *
Disallow:
Sitemap: https://SITE/sitemap.xml
----
Just replace the sitemap with your actual sitemap location. It can also be sitemap.xml.gz.
@Peter
Most probably the perfect solution, I already use sitemap.xml.gz on some sites.
And throw in WordPress plugins also trying to generate/write/change robots.txt, and it can definitely be an issue, especially for those who aren't as web savvy.
Yeah...
BTW - WordPress uses a "virtual file" to generate robots.txt, so this solution simplifies it and gives a small boost in terms of processor usage.
Hello Jennifer.
Great article. I am not seeing anything listed as blocked under Blocked Resources; however, I am seeing warnings in the PageSpeed tools about "Eliminate render-blocking JavaScript and CSS in above-the-fold content". This is affecting our speed, and I've given it to the IT team to sort, but they're having difficulty.
Do you consider this issue the same?
Regards,
Robert
Definitely run it through Google's checker - pick the homepage or whichever page has those resources - and make sure it isn't showing the error. Depending on when Google has last crawled, there could be a delay.
The render-blocking issue is different - it is referring to either JS or CSS you have running on parts of the site that are "above the fold" but that won't allow the page to render correctly until it loads. Think of sites that are trying to call a third-party script that isn't responding, and you are stuck waiting on a blank screen until it responds. This is what Google is talking about, but on a (hopefully) shorter time scale.
Google has their recommendations here: https://developers.google.com/speed/docs/insights/BlockingJS
Hi Jennifer,
Many, many thanks to you, you chose the right topic for discussion, because I think it's a major issue in SEO and many marketing executives worry about it. I am also facing this kind of problem in my Webmaster Tools, so I got lots of new things here and, of course, a solution to this problem. Thanks once again for explaining more about this topic.
@Jennifer
Thank you for the refresher on the scripts, really loved it. Along with this, a CDN is also a fine solution for JS/CSS.
As a Joomla user I have fixed the issue in the robots.txt file. GWMC sent me a message to optimise the folders Google was not able to reach.
So I did as follows, and Googlebot was happy with it.
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Allow: /templates/template/css
Allow: /templates/template/js
After you are done with the robots.txt file, do as follows:
Log in to your Google Webmaster account dashboard and do as follows:
Crawl / Fetch as Google. This option allows you to check how Googlebot interprets your site on the web. I set the drop-down option to Mobile - Smartphone:
Then click Fetch and Render and let Googlebot do its work. After a few seconds, it showed the result:
Thanks for adding the Joomla info - I know it is much harder to come by info on this for non-WordPress CMSes.
In WordPress there are a couple of extensions for this, but there is no extension for Joomla, so we have to do it ourselves.
Hey Jennifer,
Great article!
I think that is inevitable in SEO. Before, I often thought page load speed was very important: users can't wait too long, and JS & CSS will make your site slow.
And NOW, I will have to rethink this !
There are plenty of ways to optimize your JS and CSS. Google offers one way that we covered here: https://www.thesempost.com/google-pagespeed-insights-optimize-site-images-css-javascript/ I believe it is still in beta, as it wasn't officially announced. And always make copies of your files before you start, in case things go sideways :)
Awesome work Jennifer. We have tested out several sites, unblocking resources for Google. Most of the time rankings slightly improved for sites with more content and pages. On the other hand, on smaller sites we haven't noticed a significant change.
Did you find it had a larger impact on the sites in more competitive spaces where every bit of SEO tends to count?
After a long time I got a post regarding "Unblock JavaScript & CSS"; hope with this new threat Google will not harm rankings further... And thanks to Jennifer Slegg :) Good luck for your post!
Thanks Jennifer for the nicely compiled article :-)
Thanks for this article Jennifer. It has clarified some confusing issues.
Thanks for all your advice, Jennifer.
I agree with you, it's good advice, and I just say... thank you so much for sharing useful content.
Great article and nice tips! thanks
Great advice Jennifer, I concur wholeheartedly. It is apparent that critical CSS and JS is returning to being dumped into the DOM, rather than referenced in separate files that cause render-blocking issues.
The programmer in me wants everything to be elegant and neat. At first look I was a bit frustrated to see Google (and most major tools like GTMetrix) basically recommending you program a crappy website (from a coding perspective).
We've all known for at least a decade now that implementing and staying current with W3C standards is beneficial to your website. Now Google is telling us to break some of those best practices. Imagine that, Google telling us to do something while not really explaining it at all.
I get it though, it's about network speeds and the user experience. End users could care less about how lean and well organized the markup is, they just want the website to load fast on their device. If you have to dump some raw CSS or JS to achieve a faster download time (and thus a better user experience), that's what it's all about.
And it's also a case where a few bad apples ruined it for everyone by blocking CSS/JS for spam reasons. So it isn't surprising that Google wants to be able to crawl and render the entire page for ranking purposes, to prevent the spam/surprises, especially since that impacts the searcher and user experience.
*Couldn't care less
:)
Hi Jennifer! Nice tips and great article. Thanks!
Excellent article - thanks Jennifer :-)
I've a question
1 .JSON$ and
2. /blog/wp-content/ is blocked in robots.txt
Do I need to worry about them?
Check your blocked resources or just run it through Google's checker... if they have a problem with it being blocked, they will tell you. If they don't give a warning message for it, then it is fine to continue blocking it - so long as they don't change that in the future.
JavaScript and CSS influence your website's load time. This is the main reason to place them at the bottom of your HTML code.
Thank you for your contribution Jennifer.
A few days ago I made some robots.txt changes; now waiting to see the results.
Thank you!! :)
The real question is, why did people start blocking them to begin with? What did they think was going to happen?
Hi Ethan,
The reason was to ensure that Google's spiders were not wasting their time crawling non-rankable pages. Google's spiders only spend a short time on your site, and many SEOs wanted to make sure the spiders indexed more pages instead of wasting time on CSS and JS.
That assumes a very large level of ignorance on the part of Google. It's part of that mentality that says, "I have to trick them in order to rank higher," rather than working with them and making the bot's life as easy as possible. Of course blocking JS and CSS would make it harder to understand a website. Get out of that black hat mentality and it's obvious.
Yes, definitely some of the decisions Google makes re: ranking can often be traced back to how spammers were manipulating it and using it to their advantage. The same with any industry really, a few bad apples spoil it for everyone else.
Yes, some was crawl budget related. Some blocked for security reasons. And some blocked because they thought (wrongly) that Google could index CSS/JS and show them as "pages" in the search results.
Having this issue on a couple of clients' websites; the CMS is WordPress. In most of the instances the resources are not blocked in robots.txt, and when you render the page it shows up as complete. However, when you go to the Blocked Resources page in GWM it shows the blocked resources, indicating the status date as yesterday. In one of the instances, when you render the page it shows up as Partial, and once you click on it, it shows more than 10 resources; however, they have not been blocked in robots.txt. I searched all over the Internet for answers but found none. Could you please help me out? What is causing the issue and how do I fix it? Thanks.
I would look at third party plugins that could be affecting it, especially if it is an SEO plugin and particularly if they are not updated. You can also try posting on the Google Webmaster Help Forums, you can include the URL and people can see if there is anything funky going on.
This could be priceless:
User-Agent: Googlebot
Allow: .js
Allow: .css
If you have a more specialized robots.txt file, where you're blocking entire directories, it can be a bit more complicated.
In these cases, you also need to allow the .js and .css for each of the directories you have blocked.
For example:
User-Agent: Googlebot
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css
The reason is I have plenty of sites that are getting a 'blocked resources' status in Google's Search Console using the command Peter Nikolow mentions above:
User-agent: *
Disallow:
So hopefully the info Gary Illyes shared will clear this up.
Thanks for sharing.
Hi Jennifer,
Very fascinating article. I would love to see a case study on it and the results after allowing JavaScript and CSS on your website.
I suspect we will see some trying to do this. But there are so many contributing factors in the ranking algo - not to mention Google is constantly updating and tweaking it - that it will be hard to see any kind of "I unblocked JS/CSS and now rank number one!"
Hi Jeniffer,
Congrats on your post ;). If we unblock the js and css files on our WordPress sites, and consequently the /wp-admin directory... don't you think that we will open "the main door" for bots and bad crawlers to our sites?
For sure, we get better rankings, but probably our security could be more vulnerable to them.
Thanks!
Block /wp-admin/ if you want. But you can't change the fact that all WordPress sites have that folder, and anybody with malicious intentions knows it's there and what it's for. If it's not already a problem, you don't need to worry about it.
Remember robots.txt is NOT a security tool. It is a traffic signal, only good bots pay attention to it.
The bots that would try and do something nefarious with anything in a wp-admin directory are not the ones that honor robots.txt anyway.
From my standpoint it's not a direct ranking factor. However, it affects user experience, which of course leads to rankings.
Thanks Jennifer for this handy how-to.
"I suspect they'll successfully see some of them unblocked." Yes, I'm in that boat. Unblocking them now.
Another example of SEO-types trying to game the system to benefit themselves at a cost to all of us. When will they learn? When will they stop fiddling?
They never will. Shortcuts and cutting corners are ingrained in their mentality. Avoid them like the plague.