One of the first things to figure out for an SEO campaign is who your site is competing against in the SERPs. If you are consulting for a client, just asking what they know about their competitors might not be enough. Many clients will have a good handle on who their business competitors are, but these may differ substantially from the sites ranking for competitive terms. If you are running your own site, or are doing in-house SEO, you’ve probably spent a good deal of time getting comfortable with how strong the site is and for which terms it is ranking well. Either way, a more systematic approach to this preliminary competitive research can do wonders for prioritizing your SEO efforts.
Why Competitive Research?
It is true that competitive research can provide a survey of competitive link building tactics, many of which may be replicable for your own site. But more importantly, competitive research can show you which of your potential strategies is most likely to provide your site unique value, value that your competitors will probably have a harder time getting or which they seem to have neglected so far. For more on the nitty-gritty of competitive research, check Justin Briggs' guide to competitive backlink analysis.
A First Look At The SERPs
We all know what we’re here for: rankings! traffic! And rank is directly related to click-through rate. We've all seen the pretty charts from places like Eyetools:
These eye-tracking studies are great for measuring usability, and they make really pretty pictures. But for what we're trying to accomplish here, some more concrete numbers will be useful.
In Kate Morris' blog post about predicting site traffic, she cited a source of click-through rates from Chitika which I still use. There are probably more recent studies now; if you have numbers you trust more, feel free to use them instead. The click-through percentage drops off very steeply as rank increases—note that the first position has about twice the click-through rate of the second:
It's easy enough to see which sites are in the SERP and which rank higher or lower than your own site—just load up the search page! But often a site will have multiple pages listed in the SERP. So, on a per-search-term basis, I like to add these together per-site. Take the following SERP, which I put in a spreadsheet to make it easier to work with (check out Tom Critchlow's post if you'd like to speed up your own result scraping). The search is "the clash guitar tabs":
So I would say to myself at this point that www.ultimate-guitar.com is receiving 51.31% of the traffic for this search query (or 62.73% if I wanted to include tabs.ultimate-guitar.com in the figure). On the other hand, www.guitaretab.com, though occupying three places in the SERP as well, is receiving 18.75% of the traffic. Simple enough?
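If you'd rather script this than build a spreadsheet, here is a minimal sketch in Python with pandas. The rank-to-CTR mapping and the SERP rows below are illustrative placeholders, not the exact Chitika figures or the real results for this query, so swap in whatever you actually scraped and whichever CTR study you trust:

```python
import pandas as pd

# Illustrative CTR-by-position figures (NOT the exact Chitika numbers) --
# substitute the numbers from the click-through study you trust.
CTR_BY_RANK = {1: 0.34, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06,
               6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.03}

# One row per result in the SERP -- made-up ranks here; paste in the ten
# results you actually scraped for "the clash guitar tabs".
serp = pd.DataFrame({
    "rank":   [1, 2, 3, 4, 5, 6],
    "domain": ["www.ultimate-guitar.com", "www.ultimate-guitar.com",
               "tabs.ultimate-guitar.com", "www.guitaretab.com",
               "www.guitaretab.com", "www.azchords.com"],
})

# Map each position to its estimated click-through rate...
serp["ctr"] = serp["rank"].map(CTR_BY_RANK)

# ...and sum the estimated click share per domain for this one query.
per_domain = serp.groupby("domain")["ctr"].sum().sort_values(ascending=False)
print(per_domain)
```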
Taking It To The Next Level
This is all very straightforward so far: intuitive, simple, and not exceptionally useful on its own. But...
...what if, instead of restricting myself to a single SERP, I was to aggregate data from multiple searches and sum the click-through rate for each domain across these searches? Searches for "the clash guitar tabs" and "pink floyd guitar tabs" are listed below, one atop the other. I've highlighted www.ultimate-guitar.com and www.guitaretab.com for reference:
Using the magic of pivot tables I can then sum these values per-domain (if you need a pivot table refresher, check out Mike's Excel for SEO guide):
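If Excel isn't your thing, the same pivot is only a few lines in pandas. Again, the rows below are made up for illustration; in practice the DataFrame would hold every (query, rank, domain) row you scraped:

```python
import pandas as pd

# Illustrative CTR-by-position figures, as before.
CTR_BY_RANK = {1: 0.34, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06,
               6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.03}

# One row per (query, rank, domain) across every SERP in the sample.
rows = [
    ("the clash guitar tabs",  1, "www.ultimate-guitar.com"),
    ("the clash guitar tabs",  4, "www.guitaretab.com"),
    ("pink floyd guitar tabs", 2, "www.ultimate-guitar.com"),
    ("pink floyd guitar tabs", 5, "www.guitaretab.com"),
    # ...the rest of your scraped results...
]
df = pd.DataFrame(rows, columns=["query", "rank", "domain"])
df["ctr"] = df["rank"].map(CTR_BY_RANK)

# Equivalent of the Excel pivot: total estimated click share per domain,
# summed across every query in the basket.
strength = df.pivot_table(index="domain", values="ctr", aggfunc="sum")
print(strength.sort_values("ctr", ascending=False))
```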
The most powerful domains rise to the top of this list quickly. This is, of course, a very small data set. It is also a market that a few sites have dominated. If you want an interesting data set to practice this method with, try a market with many different brands ("vacuum tubes" works well—"svetlana vacuum tubes", "groove vacuum tubes", "ehx vacuum tubes" and so forth).
Get Creative
Once you've collected ranking data, you can organize it in any number of creative ways to navigate the data more intuitively—and hopefully make the data more actionable. Here is one of my favorite pivot tables, which shows how much strength each domain is receiving from results in each position (rank 1-10) in the SERPs:
This makes it easy to see which sites aren't meeting a certain threshold (e.g. never rank above position five), even though they show up in the SERPs frequently. You can also limit the list of sites in question to those with at least one page in the first page of search results.
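Here's a rough sketch of that same view in pandas, assuming the same kind of (query, rank, domain) rows as above (again, made-up data). It sums estimated click share per domain per position and then flags domains that appear on page one but never crack the top five:

```python
import pandas as pd

CTR_BY_RANK = {1: 0.34, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06,
               6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.03}

# Same shape of scraped data as before (invented rows for illustration).
df = pd.DataFrame(
    [("the clash guitar tabs",  1, "www.ultimate-guitar.com"),
     ("the clash guitar tabs",  7, "www.azchords.com"),
     ("pink floyd guitar tabs", 2, "www.ultimate-guitar.com"),
     ("pink floyd guitar tabs", 9, "www.azchords.com")],
    columns=["query", "rank", "domain"],
)
df["ctr"] = df["rank"].map(CTR_BY_RANK)

# Domain-by-position view: how much estimated click share each domain
# picks up from each ranking slot across the whole keyword basket.
by_position = df.pivot_table(index="domain", columns="rank",
                             values="ctr", aggfunc="sum", fill_value=0)
print(by_position)

# Domains that show up on page one but never rank above position five.
top_five_share = by_position.reindex(columns=range(1, 6), fill_value=0).sum(axis=1)
print(by_position[top_five_share == 0].index.tolist())
```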
Where Do I Go From Here?
There are many ways to tweak this process. You could use only results for high-traffic terms, or only for long-tail terms. You could throw in a representative sample of both. I also like to get the standard deviation from the average for each domain and set a threshold (e.g. any site greater than 2 SDs above the average is a competitor worth looking into).
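A quick sketch of that cut-off, assuming you already have the per-domain totals from the pivot (the values below are invented just to show the mechanics, and the 2-SD bar is only the rule of thumb mentioned above, not a magic number):

```python
import pandas as pd

# Per-domain totals from the earlier pivot -- invented values for illustration.
strength = pd.Series({
    "www.ultimate-guitar.com": 0.95,
    "www.guitaretab.com":      0.14,
    "www.azchords.com":        0.12,
    "www.911tabs.com":         0.10,
    "www.e-chords.com":        0.08,
    "www.chordie.com":         0.06,
    "www.songsterr.com":       0.05,
    "www.guitartabs.cc":       0.04,
})

# Flag anything more than two standard deviations above the mean.
threshold = strength.mean() + 2 * strength.std()
competitors = strength[strength > threshold].sort_values(ascending=False)
print(f"Threshold: {threshold:.2f}")
print(competitors)
```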
I’m sure there will be criticisms of this method based on my disregarding accurate search traffic numbers in my assessment. I know that the more perfect solution would seek out such figures, or other ways of assessing quantitatively how important individual search terms are. In fact, traffic aside, click-through rates can vary widely based on other factors like stacked results, mixed search results with maps and videos and images, and so forth. There is a trade off to be made, though, between the time this process takes and the power of the data it provides.
When I’m looking for a quick overview of the competitive landscape, I want my method to work fast so I can start digging into competitors’ practices and backlink profiles. Does it really matter exactly where my competitors stand against each other? Isn’t it enough to be able to find the top five or ten competitors quickly? I recommend finding the balance of detail and time that you are comfortable with. A solid ROI, if you will.
"I’m sure there will be criticisms of this method based on my disregarding accurate search traffic numbers in my assessment. I know that the more perfect solution would seek out such figures, or other ways of assessing quantitatively how important individual search terms are. In fact, traffic aside, click-through rates can vary widely based on other factors like stacked results, mixed search results with maps and videos and images, and so forth. There is a trade off to be made, though, between the time this process takes and the power of the data it provides."
Power of highly corrupted data? Is there any need to wrestle with Excel here? I don't think so.
As you know, the Chitika research didn't take into account blended search results, the type of search query, local business listings, different industries and branded search listings (like Amazon), which dramatically skew the results. So CTR calculations are highly inaccurate, to the point that you can't take competitive research to the next level by summing up 'CTR' and determining powerful domains on that basis.
IMHO determining top SEO competitors is much less complicated. The one which ranks in the top 10 of search results for the majority of your primary keywords is your true SEO competitor. I would rather use PA/DA metrics than CTR to determine the ranking potential of a domain. If I ever need CTR data for calculation purposes, it will only be from Google Webmaster Tools.
We have also found that identifying competitors by positions over time and strategies implemented provides us with the most insight in terms of competitive SEO analysis.
Thanks for raising these points, Himanshu.
As you say, these CTR numbers are never going to be truly accurate. And as I replied to Ian below, I would never report them in aggregate as is. But really all I am doing is looking at which domains rank in the top 10 for important keywords; I just decided to add an arbitrary weighting which would make it a bit more important to rank #1 as opposed to #10. Once I had a shortlist, DA/PA et cetera would absolutely be my next stop.
The important thing for me to remember, though, is that we use DA/PA and other metrics in lieu of ranking data. On the one hand, we can't know how well a domain will rank for every possible search term, so we use DA to estimate which competitors it is likely to beat. On the other hand, if we have a SERP in front of us DA is inconsequential; only rank matters.
That's why, when possible, I like to start out by looking at a few SERPs and branching out from there as I learn more about the competitors in an industry. That's just my style.
Cheers,
Ben
I have to agree with seo-himanshu: the SERPs have evolved, and they're showing no signs of stopping any time soon.
Hi Ben,
One important question: How does providing the total of the CTR % show anything about the domain?
You're adding together all the percentage scores. That could give you some really weird results, and I'm not sure it's a good indicator of competitiveness.
It works if you only use it for a single phrase at a time. But if you start adding together CTRs across different SERPs you're going to really get yourself into trouble. Here's a scenario:
My site gets .11 for one phrase (I'm number 2). That phrase has high consumer intent and high search volume.
Competitor gets .077 for that phrase.
I don't rank for 10 other phrases for which my competitor is #1. But those phrases are low search volume and low consumer intent.
So my competitor may have a score of over 3.4, while I have a measly .11. But I'm far, far better off, and a 'stronger' site in SEO terms.
This is a really good method for single phrases - just not sure it works when you work across SERPs.
Hope this makes sense,
Ian
I hear you, Ian.
It's true that you can manipulate these figures any way you like; you have the power to choose keywords which will generate any outcome.
This data is not something I would report to a client in aggregate, ever. Only per search, and then what is the difference between showing them a list and telling them to Google it and see for themselves?
But on the other hand, say I'm working on a sales proposal for a potential client. I've got about five or ten minutes to dedicate to figuring out what their market looks like. I have their site, I have keywords they are interested in. Aggregating data for these keywords gives me insights fast so that I know which of their competitors' sites I need to get metrics for, see what they are targeting, et cetera.
I hope that clarifies what I am trying to accomplish here.
Thanks,
Ben
And (just a further thought on my comment here) you can always weight for traffic if you so desire.
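Something along these lines, say (a rough sketch with invented volumes; pull real ones from AdWords or wherever you get your keyword data):

```python
import pandas as pd

CTR_BY_RANK = {1: 0.34, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06,
               6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.03}

# Invented monthly search volumes -- in practice, pull these from your
# keyword tool of choice.
volume = {"the clash guitar tabs": 1900, "pink floyd guitar tabs": 5400}

df = pd.DataFrame(
    [("the clash guitar tabs",  1, "www.ultimate-guitar.com"),
     ("the clash guitar tabs",  4, "www.guitaretab.com"),
     ("pink floyd guitar tabs", 2, "www.ultimate-guitar.com"),
     ("pink floyd guitar tabs", 5, "www.guitaretab.com")],
    columns=["query", "rank", "domain"],
)

# Estimated clicks = CTR for the position x search volume for the query,
# so a #1 ranking on a tiny term no longer outweighs a #2 on a big one.
df["est_clicks"] = df["rank"].map(CTR_BY_RANK) * df["query"].map(volume)

weighted = df.groupby("domain")["est_clicks"].sum().sort_values(ascending=False)
print(weighted)
```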
Yup, I get it.
Potentially I guess you could use (cough) KEI or something as another factor, too.
That's a good point about the variation in search volume. I do see how the final chart has some value, though... If you search variations on a similar term you can definitely see patterns emerge as to how certain sites consistently hit the top and others consistently don't, and this can lead you to think about why these patterns are that way and investigate further.
However, if we assume the click data is spotty anyway, and further complicated by search volume variations, I wouldn't put too much stock in the actual numbers (unless you just like displaying that inner spreadsheet nerd... Totally know the feeling.) Cuz you probably wouldn't be too much worse off just doing the same queries and using a quick 10-point scale to get some of the same patterns. I guess it comes down to how much time you have to generate something like this (doesn't take *that* long, but it's still an opportunity cost) and how valuable you think it will be to you.
Yes, in that "per rank" view with the pretty colors at the end, using CTR numbers becomes less relevant. In fact, I probably only use them there because that was just another pivot table for me manipulating the same source data in Excel (i.e. the numbers were already there).
But this isn't a prescription of methodology for everyone, it's just something to think about. If you have less time, do less. If more, do more. Really, except for Analytics data (most of the time) every metric we use in SEO has a degree of "spottiness", if you will. We all end up finding metrics we are comfortable using.
Is it the best way to analyze CTR, or search-engine traffic to your site or anyone else's? No!
Is it a good way to get a really quick list of who your top competitors are? Yes. Which is what the title claimed.
I got fairly useless data out of SEOmoz for the first couple weeks because when the campaign set-up asked for my competitors, I said, "Uh, er... what?" This post should be included in the campaign set-up FAQ.
wow... great article, Ben... And more than this, thanks for those useful links, especially the one about building agile SEO tools using Google spreadsheets:) I didn't know about that...
I figured out way early that simply asking potential clients who their competition is isn't enough. The best way, in my opinion, is to ask who their biggest competitors are from a search perspective. Making them do some research, I find, helps keep their expectations in line.
I've also taken this a step further and included search volume, SERP entry types (places, organic, video, news...), localisation and relevance factors.
My results are what I call traffic estimates, i.e. an estimate of the traffic the website will get from each keyword. Cross-referencing back to true traffic does show there is a lot of variation from reality, but it's better than a complete guess!
Then I rank domains based on their traffic estimate to see who the top competitors are.
It's good to identify those big competitors for analysis and provide a ranking so my clients can see how well they are doing against them over time.
I've also automated it so I can process 1,000s of keywords for each client on a monthly basis, which is nice :-)
Quick and dirty competitive analysis. Love it! Thanks for sharing.
Wow what a great breakdown. Gonna bookmark and print this for future use for sure.
Good info based on defining true competitors.
Right now I am using some online SEO tools.
I think knowing who our competitors are is what's best for us in SEO, and this way you know which positions you hold in the SERPs. All the information here is really good, and reading it has helped me in my work.
It's very useful to know Excel - some SEOs are afraid their job will be automated, but it's much more likely they'll underperform and lose the job anyway. Just learn Excel, and VB.
I like your process here and how you estimate click through rates for competitors, but I don't necessarily think you need CTR to determine competitors in the search space. If a site is ranking well for your targeted term, there is your competition in search. Once you identify the websites ranking well, you can look at their link profiles, social media impact, and on-page content to find their strengths and weaknesses.
Analyzing data from a few competitors at the top of your targeted search terms will help identify potential link partners, authoritative sites in the industry, relevant content, and a host of other metrics. So maybe I am admittedly missing something valuable, but I usually don't calculate click through rates because we don't have accurate data for it - it changes with the inclusion of local results, number of sponsored ads, titles, descriptions, etc, as you point out at the end. Your study reproduced what Google already told us - the top ranking domains had the highest click through, and just identifying the top ranking sites for the search terms yields strikingly similar results.
Now if you could identify bounce rates for sites and plug that in, you could end up seeing wild variations in click through data. For instance, if the top 2 sites are not that great of results, you could see a spike in click through rates for sites ranking below the top two results if people are getting farther down the search page, or it could have an equally negative effect by users trying different search terms and leaving the search before they get to the 3 or 4 spots. There is so much to consider there when you think about it.
Thanks for sharing - your post really got me thinking.
Hi Ben,
Great post! CTR is an important factor and I like how you discussed the different CTR from sites that rank within the top 3 versus the rest of page 1!
Thank you!
This is so well done, I can't believe there are only 3 comments. You have done a very good job explaining your process, from start to finish.
Thanks for this, consider it bookmarked as a resource.
The reason why this post works so well is because it takes a simple concept and really drills down to get the most out of it. I can see this being so useful for SEOs working with clients and reports. - Jenni
Great Post - might even tempt me back to using Excel!
Always nice to see quantifiable data in SEO research, which I think leads to much better end results, rather than basing things on guesswork or wooly assumptions.
Thanks. If you need an Excel refresher, be sure to check out Mike's Excel for SEO guide that I link to in the post. He put a lot of love into it, and we all know love is the secret ingredient for Excel magic.
This certainly is a thorough way to go about defining your competition. It's definitely important to figure out who your competitors are and what they are doing, even if it's just a basic overview of their online marketing efforts. This approach might be a little too time-consuming for some companies, but it's definitely worth the effort for those that have the time to do thorough research.
Hi, whilst CTR is important for competitive analysis, I would rather spend my efforts doing analysis of the on-site aspects of each competitor's website.
CTR can vary for each niche and for the title/meta copy you use - I mean, type in SEO MOFO for example and see the meta description lol...
But yeah, I like how you have gathered this CTR data, very impressive.
I use a similar technique for working out who online competitors are - and using weighted search volumes from AdWords as you've suggested is a good way of making it more realistic (so sites which consistently rank #3 for keywords don't skew data away from the site which ranks #1 for the big money keyword). It's not perfect - and show me an SEO technique for competitive data gathering that is - but it takes me 15 mins to take a basket of keywords and crunch through the data to give a decent stab at the 10 (or 20 or 50) biggest competitors for a client's site online.
Ben, cool topic, well done for going with it.
Now, although I think this is valuable, I wonder whether it wouldn't be more important to spend time analysing competitors' sites rather than figuring out click-through ratios. I completely understand the purpose of this and it's something I've done myself for several projects I've been involved in, but I always find myself gaining more by spending time looking at competitors' sites and their link building tactics than actually seeing how many clicks they get.
Each to their own of course.
I think this technique is most valuable in the "finding out who your competitors really are" - in my experience, people's perception of who their competitors are online isn't complete. This kind of analysis helps dig out those websites which aren't even on your radar. And who knows, maybe finds some websites which might be useful for other SEO activity later on...
Tim, absolutely, sorry if my comment came out the wrong way. I didn't mean to say this was bad or anything; I just wondered whether it's the best thing to spend time on. I do agree, it's a great way of figuring out proper competition, which is something a huge number of SEOs and wannabe SEOs get wrong.