As an SEO, whether you’re working in-house or handling many clients in an agency, you’ve likely been using this tool for all kinds of reasons. Whether it's diagnosing traffic and position changes or finding opportunities for optimization and content ideas, Google Search Console's Search Analytics has been at the core of most SEOs' toolsets.
The scope of this small guide is to give you a few ideas on how to use Search Analytics together with Google Sheets to help you in your SEO work. As with the guide on how to do competitive analysis in Excel, this one is also focused on a tool that I’ve built to help me get the most out of Search Analytics: Search Analytics for Sheets.
The problem with the Search Analytics UI
Sorting out and managing data in the Google Search Console Search Analytics web UI in order to get meaningful insights is often difficult to do, and even the CSV downloads don't make it much easier.
The main problem with the Search Analytics UI is grouping.
If you’d like to see a list of all the keywords in Search Analytics and, at the same time, get their corresponding landing pages, you can’t do that. You instead need to filter query-by-query (to see their associated landing pages), or page-by-page (to see their associated queries). And this is just one example.
Basically, with the Search Analytics UI, you can’t do any sort of grouping on a large scale. You have to filter by each keyword, each landing page, each country, etc. in order to get the data you need, which would take a LOT of time (and possibly a part of your sanity as well).
In comes the API for the save
Almost one year ago (and after quite a bit of pressure from webmasters), Google launched the official API for Search Analytics.
With it, you can do pretty much anything you can do with the web UI, with the added benefit of applying any sort of grouping and/or filtering.
Excited yet?
Imagine you can now have one column filled with keywords, the next column with their corresponding landing pages, then maybe the next one with their corresponding countries or devices, and have impressions, clicks, CTR, and positions for each combination.
Everything in one API call
Query | Page | Country | Device | Clicks | Impressions | CTR | Position |
---|---|---|---|---|---|---|---|
keyword 1 | https://domain.com/us/page/ | usa | DESKTOP | 92 | 2,565 | 3.59% | 7.3 |
keyword 1 | https://domain.com/us/page/ | usa | MOBILE | 51 | 1,122 | 4.55% | 6.2 |
keyword 2 | https://domain.com/gb/ | gbr | DESKTOP | 39 | 342 | 11.4% | 3.8 |
keyword 1 | https://domain.com/au/page/ | aus | DESKTOP | 21 | 55 | 38.18% | 1.7 |
keyword 3 | https://domain.com/us/page/ | usa | MOBILE | 20 | 122 | 16.39% | 3.6 |
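To give you an idea of what’s happening under the hood, here’s a minimal sketch of the kind of API request that produces a table like the one above, using the Python client library. The site URL, dates, and credentials setup are placeholders, and the add-on takes care of all of this for you:

```python
# Minimal sketch of a Search Analytics API request grouped by several
# dimensions at once. Assumes OAuth credentials already authorized for the
# Search Console scope; the site URL and dates are placeholders.
from googleapiclient.discovery import build

def fetch_search_analytics(credentials, site_url="https://domain.com/"):
    service = build("webmasters", "v3", credentials=credentials)
    request = {
        "startDate": "2016-05-01",
        "endDate": "2016-05-31",
        # One call, grouped by query, page, country, and device
        "dimensions": ["query", "page", "country", "device"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=request).execute()
    for row in response.get("rows", []):
        query, page, country, device = row["keys"]
        print(query, page, country, device,
              row["clicks"], row["impressions"], row["ctr"], row["position"])
    return response
```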
Getting the data into Google Sheets
I have traditionally enjoyed using Excel but have since migrated over to Google Sheets due to its cloud nature (which means easier sharing with my co-workers) and expandability via scripts, libraries, and add-ons.
After being heavily inspired by Seer Interactive’s SEO Toolbox (an open-source Google Sheets library that offers some very nice functions for daily SEO tasks), I decided to build a Sheets script that would use the Search Analytics API.
I liked the idea of speeding up and improving my daily monitoring and diagnosing for traffic and ranking changes.
Also, using the API gave me the pretty useful option of automatically backing up GSC data once a month. (Before, you needed to do this manually, use a paid Sheets add-on, or run a Python script.)
Once things started to take shape with the script, I realized I could take this public by publishing it as an add-on.
What is Search Analytics for Sheets?
Simply put, Search Analytics for Sheets is a (completely free) Google Sheets add-on that allows you to fetch data from GSC (via its API), grouped and filtered to your liking, and create automated monthly backups.
If your interest is piqued, installing the add-on is fairly simple. Either install it from the Chrome Web Store, or:
- Open a Google spreadsheet
- Go to Add-ons -> Get add-ons
- Search for Search Analytics for Sheets
- Install it (it'll ask you to authorize a bunch of stuff, but you can rest easy: the add-on has been reviewed by Google, and no data is saved/monitored/used in any way other than grabbing it and putting it in your spreadsheets).
Once that's done, open a spreadsheet where you'd like to use the add-on and:
- Go to Add-ons -> Search Analytics for Sheets -> Open Sidebar
- Authorize it with your GSC account (make sure you’re logged in to Sheets with your GSC account, then close the window once it says it was successful)
You’ll only have to do this once per user account, so once you install it, the add-on will be available for all your spreadsheets.
PS: You'll get an error if you don't have any websites verified on your logged-in account.
How Search Analytics for Sheets can help you
Next, I’ll give you some examples of what you can use the add-on for, based on how I mainly use it.
Grab information on queries and their associated landing pages
Whether it is to diagnose traffic changes, find content optimization opportunities, or check for appropriate landing pages, getting data on both queries and landing pages at the same time can usually provide instant insights. Other than automated backups, this is by far the feature that I use the most, especially since it’s fairly hard to replicate the process using the standard web UI.
Best of all, it’s quite straightforward to do this and requires only a few clicks:
- Select the website
- Select your preferred date interval (by default it will grab the minimum and maximum dates available in GSC)
- In the Group field, select “Query,” then “Page”
- Click “Request Data”
That’s it.
You’ll now have a new sheet containing a list of queries, their associated landing pages, and information about impressions, clicks, CTR, and position for each query-page pair.
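If you're curious how that sheet maps back to the API, each response row carries its dimension values in a `keys` array (in the same order as the dimensions you requested) plus the four metrics. Here's a rough sketch of flattening such a response into a CSV that mirrors the sheet, assuming a response requested with just the query and page dimensions as in the steps above:

```python
import csv

def rows_to_csv(response, path="query_page_report.csv"):
    """Flatten a Search Analytics response grouped by query and page into a
    CSV that mirrors the sheet the add-on creates."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Query", "Page", "Clicks", "Impressions", "CTR", "Position"])
        for row in response.get("rows", []):
            query, page = row["keys"]  # same order as the requested dimensions
            writer.writerow([query, page, row["clicks"], row["impressions"],
                             "{:.2%}".format(row["ctr"]), round(row["position"], 2)])
```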
What you do with the data is up to you:
- Check keyword opportunities
Use a Sheets filter to only show rows with positions between 10 and 21 (usually second-page results) and see whether the landing pages can be further optimized to push those queries to the first page. Maybe work a bit on the title tag, content, and internal linking to those pages.
- Diagnose landing page performance
Check position 20+ rows to see whether there’s a mismatch between the query and its landing page. Perhaps you should create more landing pages, or there are pages that target those queries but aren’t accessible by Google.
- Improve CTR
Look closely at position and CTR. Check low-CTR rows with associated high position values and see if there’s any way to improve titles and meta descriptions for those pages (a call-to-action might help), or maybe even add some rich snippets (they’re pretty effective in raising CTR without much work).
- Find out why your traffic dropped
- Had significant changes in traffic? Do two requests (for example, one for the last 30 days and one for the previous 30 days), then use VLOOKUP to compare the data (there's a rough sketch of this comparison after this list).
- Positions dropped across the board? Time to check GSC for increased 4xx/5xx errors, manual actions, or faulty site or protocol migrations.
- Positions haven’t dropped, but clicks and impressions did? It might be seasonality; time to check year-over-year analytics, Google Trends, and Keyword Planner.
- Impressions and positions haven’t dropped, but clicks/CTR did? Manually check those queries, see whether the Google UI has changed (more top ads, featured snippet, AMP carousel, “In the news” box, etc.)
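As mentioned in the traffic-drop item above, comparing two periods is just a matter of matching the two pulls on the query-page pair. If you prefer doing that outside of Sheets, here's a rough pandas sketch; the inputs are assumed to be the `rows` arrays from two API responses grouped by query and page:

```python
import pandas as pd

def compare_periods(current_rows, previous_rows):
    """Match two pulls (e.g. last 30 days vs. the 30 days before) on the
    query + page pair, much like a VLOOKUP in Sheets, and compute deltas."""
    def to_frame(rows):
        return pd.DataFrame([{
            "query": r["keys"][0],
            "page": r["keys"][1],
            "clicks": r["clicks"],
            "impressions": r["impressions"],
            "position": r["position"],
        } for r in rows])

    merged = to_frame(current_rows).merge(
        to_frame(previous_rows),
        on=["query", "page"], how="outer",
        suffixes=("_current", "_previous")).fillna(0)
    merged["clicks_delta"] = merged["clicks_current"] - merged["clicks_previous"]
    merged["position_delta"] = merged["position_current"] - merged["position_previous"]
    # Biggest click losses first
    return merged.sort_values("clicks_delta")
```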
I could go on, but I should probably leave this for a separate post.
Get higher granularity with further grouping and filtering options
Even though I don’t use them as much, the date, country, and device groupings let you dive deep into the data, while filtering allows you to fetch data specific to one or more dimensions.
Date grouping creates a new column with the actual day when the impressions, clicks, CTR, and position were recorded. This is particularly useful together with a filter for a specific query, so you can basically have your own rank tracker.
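For the curious, this is roughly what that looks like at the API level: group by date and filter to one query. The sketch below is illustrative only; the dates and query are placeholders, and `service` is assumed to be the authorized client from the earlier example:

```python
def track_query_positions(service, site_url, query_text):
    """Rough rank-tracker sketch: group by date and filter to a single query,
    so each returned row is that query's average position for one day."""
    request = {
        "startDate": "2016-05-01",
        "endDate": "2016-05-31",
        "dimensions": ["date"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": "equals",
                "expression": query_text,
            }]
        }],
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=request).execute()
    # Map each day to the average position recorded for that query
    return {row["keys"][0]: row["position"] for row in response.get("rows", [])}
```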
Grouping by country and device lets you understand where your audience is.
Using country grouping will let you know how your site fares internationally, which is of course highly useful if you target users in more than one country.
However, device grouping is probably something you’ll play with more, given the rise in mobile traffic everywhere. Together with query and/or page grouping, it’s useful for understanding how Google ranks your site on desktop and mobile, and where you might need to improve. Generally speaking, you’ll probably be more interested in mobile rankings here rather than desktop, since those can pinpoint problems with certain pages on your site and their mobile usability.
Filtering is exactly what it sounds like.
Choose between query, page, country and/or device to select specific information to be retrieved. You can add any number of filters; just remember that, for the time being, multiple filters are added cumulatively (all conditions must be met).
Other than the rank tracking example mentioned earlier, filtering can be useful in other situations as well.
If you’re doing a lot of content marketing, perhaps you’ll use the page filter to only retrieve URLs that contain /blog/ (or whatever subdirectory your content is under), while filtering by country is great for international sites, as you might expect.
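At the API level, those cumulative filters live in a single filter group whose conditions must all match. Here's a hedged illustration of the relevant part of the request body (dates omitted; the /blog/ path and country code are just example values):

```python
# Example of cumulative (ANDed) filters in the underlying request body:
# every filter inside the group must match. Dates are omitted here, and the
# /blog/ path and country code are just illustrative values.
request = {
    "dimensions": ["query", "page"],
    "dimensionFilterGroups": [{
        "groupType": "and",
        "filters": [
            {"dimension": "page", "operator": "contains", "expression": "/blog/"},
            {"dimension": "country", "operator": "equals", "expression": "usa"},
        ],
    }],
}
```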
Just remember one thing: Search Analytics offers a lot of data, but not all the data. Google tends to leave out data that is too individual (results that aggregate very few users, such as long-tail queries).
This also means that, the more you group/filter, the less aggregated the data is, and certain information will not be available. That doesn’t mean you shouldn’t use groups and filters; it’s just something to keep in mind when you’re adding up the numbers.
Saving the best for last: Automated Search Analytics backups
This is the feature that got me into building this add-on.
I use GSC data quite a bit, from client reports to comparing data from multiple time periods. If you’ve used GSC/WMT at all, you probably know that the data available in Search Analytics only spans roughly the last 90 days.
While the guys at Google have mentioned that they’re looking into expanding this window, most SEOs have had to rely on various ways of backing up data in order to access it later.
This usually requires either remembering to manually download the data each month, or using a more complicated (but automated) method such as a Python script.
The Search Analytics for Sheets add-on allows you to do this effortlessly.
Just like when requesting data, select the site and set up any grouping and filtering that you’d like to use. I highly recommend using query and page grouping, and maybe country filtering to cut some of the noise.
Then simply enable the backup.
That’s it. The current spreadsheet will host that backup from now on, until you decide to disable it.
What happens now is that once per month (typically on the 3rd day of the month) the backup will run automatically and fetch the data for the previous month into the spreadsheet (each month will have its own sheet).
In case there are delays (sometimes Search Analytics data can be delayed even up to a week), the add-on will re-attempt to run the backup every day until it succeeds.
It'll even keep a log with all backup attempts, and send you an email if you'd like.
It'll also create a separate sheet for monthly aggregated data (the total number of impressions and clicks plus CTR and position data, without any grouping or filtering), so you can be sure you're 'saving' the real overview information as well.
If you’d like more than one backup (either another backup for the same site but with different grouping/filtering options or a new backup for a different site), simply open a new spreadsheet and enable the backup there. You’ll always be able to see a list with all the backups within the “About” tab.
For the moment, only monthly backups are available, though I’m thinking about including a weekly and/or daily option as well. However, that might be more complicated, especially in cases where GSC data is delayed.
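And if you'd still rather go the DIY Python-script route mentioned earlier, a monthly backup boils down to computing last month's date range and saving the pull somewhere. A rough sketch, assuming the same authorized `service` object as before (file naming and storage are entirely up to you):

```python
import datetime
import json

def backup_previous_month(service, site_url):
    """Sketch of a DIY monthly backup: pull last month's data grouped by
    query and page and dump it to a JSON file named after the month."""
    first_of_this_month = datetime.date.today().replace(day=1)
    last_month_end = first_of_this_month - datetime.timedelta(days=1)
    last_month_start = last_month_end.replace(day=1)

    request = {
        "startDate": last_month_start.isoformat(),
        "endDate": last_month_end.isoformat(),
        "dimensions": ["query", "page"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=request).execute()

    filename = "gsc_backup_{:%Y_%m}.json".format(last_month_start)
    with open(filename, "w") as f:
        json.dump(response.get("rows", []), f)
    return filename
```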
Going further
I hope you’ll find the tool as useful as I think it is.
There may be some bugs, even though I tried squashing them all (thanks to Russ Jones and Tori Cushing, Barry Schwartz from Search Engine Roundtable, and Cosmin Negrescu from SEOmonitor for helping me test and debug it).
If you do find anything else or have any feature requests, please let me know via the add-on feedback function in Google Sheets or via the form on the official site.
If not, I hope the tool will help you in your day-to-day SEO work as much as it helps me. Looking forward to seeing more use cases for it in the comments.
PS: The tool doesn't support more than 5,000 rows at the moment; working on getting that improved!
+100 pts! This is beyond amazing, I've been meaning to find a good solution for this problem for ages but haven't had the time to learn how to query the API.
For those who haven't tried dumping data out of Search Console, this will literally save hours every time.
Mihai - I'd love to know if you have more posts on how to use this in creative ways?
Also, I'd like to know your opinion or solution regarding the fact that (not set) has been creeping into our queries (Google Analytics > Search Console > Queries) since Oct 2015. It's now showing 15% for some sites I manage. Is this just something we have to deal with now?
Thanks Joe! Unfortunately, I don't have anything else written about this. There's the official site, but that's mainly to explain what each feature does.
Regarding (not set), that's basically due to the fact that Google filters out some queries that have few impressions or contain sensitive information, to ensure user privacy. They mention this in the Search Analytics Report help page:
To protect user privacy, Search Analytics doesn't show all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information.
That happens in Search Console as well, it's just that those queries are simply not shown (whereas in GA they're grouped as 'not set').
15% is actually pretty good! I have sites with 70%! It largely depends on the type of website. Content-heavy sites tend to have more queries filtered out due to the large amount of long tail keywords, versus e-commerce sites for example.
Anyway, thanks again, hope you enjoy working with the tool :)
Very well done Mihai! Having a 'Backup Totals' sheet is a nice touch to see performance over time :)
Thanks! Looking forward for any feedback :D
Hi Mihai,
Thanks for this post. I was trying to find tutorials online on how to use Google Analytics more properly, because I wanted to see if I could find a way to do better SEO (less quantity but more quality).
I have kept your post open because I can't remember everything, but I'm enjoying playing with analytics this way!
You made my monday :D!
Thanks, Camille
That's my goal as well: being able to get to the useful data as fast as possible, to give you more time to get insights, make decisions, and make changes.
Mihai! - I always enjoy your questions and contributions to John Mueller's webmaster hangouts and office hours; thanks so much for making this add-on and freely offering it to the SEO community via Moz. I'm looking forward to testing this out "in my copious free time," haha.
I"d recently played around a bit with Analytics Edge downloading GSC query/LP data (etc.) and I'm wondering if you've used that as well?--aside from the fact that AE is not free, and your tool also auto-downloads GSC data monthly, if you've played with AE are there any other differences and/or advantages to your add-on that you think are worth calling out / highlighting here?
Thanks again!
I've only used one other add-on that offered similar features, and was only partially free, but it's not the one you mentioned. If you see any features that might be worth adding though, let me know!
I totally agree with David, this post is just awesome, I have to study this :)
Thanks for this great post, playing with the tool right now. Very useful to get search data organized this way and apply what Cyrus Shepard recommended here.
That's a really good tip, thanks!
YES!! This is fantastic. I am not the best spreadsheet person--it's something I need to work on--and I've been looking for a way to leverage data from Search Console for a while now. Being able to have a backup of all the data is also something that I haven't yet done for clients, but I have wanted to start doing.
Thanks Collin, looking forward to any feedback, my aim was to make it as simple and as user-friendly as possible.
Really good insights. The 90 day view is really frustrating, but this offers a fresh and quick way to get around it! I can see how syncing this data with the SEOtoolsforexcel plugin by pulling in GA, SEMrush and Majestic SEO data will give some incredible insights without having to spend hours compiling data.
Is this something you have considered?
Thanks! I have, but I decided to first focus on doing Search Analytics right, then worry about other APIs :)
Thanks for the post and the tool. We have already implemented the spreadsheet and we are getting great insights!!! Simply awesome :)
Sounds awesome! Would love to hear what insights you've used it for!
This addon is really amazing. Yet to test the backup functionality. Definitely makes the task of analysing pages with related queries super easy!
I appreciate the time and energy you put into this post and the detailed information you offer. It's awesome to come across a blog every once in a while that isn't the same unwanted rehashed material. Great read!
Glad you enjoyed it, thanks for the kind words!
It's great content, but I don't know how to use the API. I got stuck on how to use the Google API. Is there a guide around? What technical work is required to make the API work?
Hey Luca!
Using the tool doesn't require you to know the technical aspects of the API, but you can read about each specific feature at https://searchanalyticsforsheets.com.
If you would like to read more about the API though, you can always take a look here: https://developers.google.com/webmaster-tools/v3/s... (it's highly technical though).
Super helpful post. I was apprehensive about adding yet another Chrome extension, but it looks safe and will probably be quite rewarding too! Thanks a lot.
Oh my, this tool is SO useful. It totally hits the pain points I had with my GSC usage and solves them.
Straightforward, easy to use, just totally awesome!
I teach online marketing to local small businesses in Hong Kong. I'm going to be recommending it to everyone.
Thanks Mihai!
Thank you for trying it, do let me know if you hit any issues or have any feedback!
It's a really hard and difficult task to find this kind of creative and valuable information. I shared this post with all my colleagues. It's a very helpful and interesting tool.
Hello there,
Thank you for the step-by-step and useful information. But when I tried to grant access to the add-on, I got this error:
Error: There was an issue showing the sidebar at line 400: Couldn't fetch the list of sites verified in your account, an error occurred at 339: There was a problem retrieving sites from your Search Console account:500, { "error": { "code": 500, "message": null } } . We'll reset the authorization info. (line 162, file "Code", project "Search Analytics for Sheets")
Any ideas?
Hey! It seems Search Console is currently down, including the API. Try again in a few minutes, it should be back up soon.
This add-on is amazing! Thank you so much!
This tool is amazing. The queries/landing page sync, plus the added filtering - a HUGE time saver. Adding this to the GA API Add-on, and being really familiar with VLOOKUP, is just mindblowing. So - great job!
The only thing I don't see is any way of knowing what the query was that produced any one specific sheet. I was so excited when I loaded this that I ran about 10 different reports, and then looking back I was wondering - OK, now this view did what? Obviously I could name the tabs...
But this is a great tool and a wonderful post. Thank you!
Thanks Fred, that's actually good feedback! Perhaps I could either allow naming the report before generating it, or add some sort of information before the actual data (in the first row or something like that).
The automated backup feature is a lifesaver! I keep monthly backups of several domains and subdomains, but the manual backup is time-consuming and sometimes I totally forget to do it.
I have more than 60k different search queries listed in a monthly report and it makes Excel quite slow. I wonder how much memory Chrome will consume with Google Sheets like that. :-)
Hey Gyorgy, thanks for using it!
Two issues:
Thanks for letting me know. I think I will just wait for the updated version and the 2 million cell limit should be fine. If I calculate correctly, clicks, impressions, CTR and position of 60k search terms is 300k cells. Will you update this post when the new plugin is ready?
OMG this is awesome... Need to work on this really quickly. Data about landing pages is just GOLD!
Thanks for the incredibly well thought out and detailed article. I've been needing something like this for years. I'll update this comment with how I get on.
Great information, it is very useful. Many tips like this are needed for SEO aspirants, along with the dos and don'ts of SEO.
It's amazing; I am sure many folks will back me up on this. Many of us always wish we had historical data; sometimes we do, but sometimes we don't. This has made it easy. I am sure everyone will love to use it and keep it in their regular routine for data analysis purposes. Thanks Mihai for sharing this. :)
Sure thing! The backup feature was the main reason for building this in the first place :)
If there's something we SEOs have to do, it's collecting data: get all the metrics we can, even data that seems less important, like mentions on Twitter. The more metrics we collect, the more control over SEO we'll have. In my case, I have a spreadsheet where I put all the details from Google Analytics, Search Console, Moz, and all the other SEO apps I know (there are lots); I count how many blog posts a site has or how many words each blog post has, etc., and I do this for me and my competitors. Collect all the data you can. Thanks for the post; this makes it easier to get that done.
Thanks for the information. I loved the part about backups. You never know if someday you'll need them.
Hi Mihai
Thank you very much for the information. The truth is it's sometimes a bit annoying having to manually copy all the information (keywords, landing pages, traffic...).
That's what I aim to avoid, let me know if you have any feedback!
Great insights, Mihai. We have also used the Google Sheets add-on and it helps a lot with automation. Another tool we have just recently discovered is teacupanalytics.com. It's paid, but it has some basic free features to try it out.
Thank you for this great guide. The Google Analytics UI gives me a lot of headaches.
I was thinking of switching to another service but wasn't sure if anyone provides data as accurate as Google itself. And it's free, so the headache I'm getting is worth it to cut down costs and get accurate data.
These tricks will definitely make my life a bit easier :)
Hey Ray, thanks for the kind words. Note though this add-on is for Search Console (particularly Search Analytics), not Google Analytics.
Thank you for pointing it out, Mihai. I actually meant Search Console but was talking about Google Analytics (since Search Console gives stats or analytics as well). Sorta mixed it up in my head haha.
Thank you so much for this very valuable tip! This is going to make my life a lot easier.
Mihai,
Thanks to your post I was able to replicate this successfully. I've been looking for a convenient solution to maintain a GSC/WMT record after the 90-day threshold. Pulling the data manually wasn't feasible, which had previously discouraged me from doing so. Fantastic post!
Thanks Paul, glad it works well!
Hi Mihai,
Thanks so much for this. There was a time when Google restricted the data range that webmasters could save and analyze. Later they came up with an API solution, which again wasn't an easy option for every SEO person. Is there something like this that can fetch data from GA? No doubt we have Search Console data available out there as a beta feature. If there is anything, please share that too.
Thanks again!!
Hey Amit,
Yes, there is a GA API, and there are some add-ons that work with it (but Search Console information isn't included in the GA API).
I'm trying to use Analytics right now for my reports and I think this is a good guide for me. Thanks for writing an article about this!
Sorry for the self-promotion, but if there is anyone who hates Excel sheets like me, here is an online tool that helps to do exactly the same thing: https://www.reportdash.com/
I should admit that it does not have the automatic backup functionality, though you could create schedules to periodically mail the report to your inbox in PDF or Excel format.
I understand that there will be a lot of people who love to have this built into excel. So kudos for the work.
This is an amazing post. The Search Analytics add-on was new to me.
You can also use external tools, such as Search Console Helper (https://searchconsolehelper.com/), that do all the hard work for you...
Thanks for the info. This is definitely what I need for my SEO audits.
Thank you. I want to ask why there was a difference in the data while monitoring the traffic of my blog kangzaber.com when using Google Analytics. What do you think of Histats in comparison?
This is amazing. I came across this post when trying to explain the discrepancy between monthly click volume seen in Search Console vs total monthly click volume seen in the .csv that the tool allows you to export.
Using Search Analytics for Sheets gets me much closer to the total volume than I was before, but the numbers still aren't the same. Is there a limit to the amount of data that the API can access? Or should the numbers be identical?
Go into the Google AdWords "keyword planner" and select the first option, where it says "select new keywords using a phrase, website or category." Put your domain in the spot where it says "your landing page" and then click "get ideas." From there, export the data into CSV, and there you go! The columns include:
Keyword
Avg. Monthly Searches (exact match only)
Competition
Suggested bid
Impr. share
Organic imp. share
Organic Avg. position
And my favorite: "What URL the Keyword was Extracted From."
You can then go back and evaluate the on page SEO for each page on your site, to ensure the right keywords are being targeted. Or, you can go back and only optimize individual pages for the first time now that you can see the low hanging fruit.
Google goes ahead and scans each page on the site, then extracts the specific URL that it believes is most relevant to each keyword. Then you can find which keywords to focus on, because right on that same CSV it shows the AdWords data such as "average monthly searches," etc. I found this method extremely helpful, so I wanted to share it. I will be trying the strategies mentioned on this page as well; great recommendations.
Hello Mihai Aperghis, thank you for the great review of the console. I use the search capabilities of the console, but I don't know that much; it is a very powerful analysis tool and very difficult to master if you don't draw on other people's knowledge. Your article was very useful for me, thank you.
Nice and very important post. Thanks for sharing.
This is great, really useful Mihai - thanks for sharing.
Thanks for the article, very interesting and useful
Thanks for sharing your valuable information with us. I hope we will learn a lot more like this from you. Thank you.
Very nice write up. Will definitely try this out this week.
Very helpful... This is my first time using this tool. Thank you for sharing. You made my work more accurate.
Nice share Mihai! Thanks.
I’m glad you liked this post. Very informative post. Thanks for sharing.
Thank you very much for sharing this information with us. We learn every day to improve our websites.
Amazing and useful post. Thanks for sharing the knowledge.
I think this is wonderful, so helpful for bloggers.
Would love to know how you use the add-on specifically for a blogging site!
Oh yeah I'm a blogger too, this tool will be SOOO amazing.
This could be used so I can identify which blog posts are 'diamonds in the rough' for certain keywords, meaning they're hitting at maybe around positions 8-12.
With these posts, a slight lift is probably easy enough, but it could also mean multiple times the SEO traffic for those posts, and this tool can help me identify immediately which ones these 'opportunities' are.