Searchmetrics recently released its yearly Ranking Factors Study, which bases its numbers on rank correlations and on averages across top-ranking search results. This year's analysis shows that content on top-performing sites is much more holistic and much less keyword-focused.
Everybody talks about how "content is king." People are advised to "create quality content for users," and, not least since keyword (not provided), some have said "the keyword is dead." Though these phrases may convey somewhat understandable approaches, they are often nothing more than empty clichés that leave webmasters alone without any further guidance.
Making relevant content measurable
What is quality content? How can I create relevant content for my users? Should I still place the keyword in the title or use it seven times in the content?
To understand how search engines develop over time and what kind of features increase or decrease in prevalence and importance, we analyze the top 30 ranking sites for over 10,000 keywords (approximately 300,000 URLs) each year. The full study with all 100 pages of details is downloadable here.
In a nutshell: To what extent have Panda, Penguin, and not least Hummingbird influenced the algorithm and therefore the search results?
Before we get into detail, let me—as a matter of course—point out the fact that correlation does not imply causation. You can find some more comprehensive information, as well as an introduction and explanation of what a correlation is, here. That is why we took two approaches:
- Correlation of Top 30 = Differences between URLs within SERP 1 to 3
- Averages = Appearance and/or extent of certain factors per position
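For readers who want to see concretely what these two views mean, here is a minimal sketch in Python of how one might compute them for a single on-page feature. The data layout, field names, and the per-keyword averaging are assumptions made for illustration only, not Searchmetrics' actual pipeline.

```python
# Hypothetical illustration of the two approaches described above:
# (1) Spearman rank correlation between ranking position and a feature,
# (2) the average value of that feature per position.
# Data shape and field names are assumptions, not the study's real pipeline.
from collections import defaultdict
from scipy.stats import spearmanr

# serp_data: one record per ranked URL for one keyword (positions 1-30)
serp_data = [
    {"keyword": "apple watch", "position": 1, "keyword_in_h1": 1},
    {"keyword": "apple watch", "position": 2, "keyword_in_h1": 0},
    # ... positions 3-30 for this and thousands of other keywords
]

def correlation_of_top30(rows, feature):
    """Spearman correlation of a feature with position, averaged over keywords."""
    by_keyword = defaultdict(list)
    for row in rows:
        by_keyword[row["keyword"]].append((row["position"], row[feature]))
    correlations = []
    for pairs in by_keyword.values():
        positions, values = zip(*sorted(pairs))
        if len(set(values)) > 1:              # correlation is undefined for constant values
            rho, _ = spearmanr(positions, values)
            correlations.append(-rho)         # flip sign: "better rank, more feature" = positive
    return sum(correlations) / len(correlations) if correlations else 0.0

def averages_per_position(rows, feature):
    """Average feature value for each position (the 'averages' view)."""
    by_position = defaultdict(list)
    for row in rows:
        by_position[row["position"]].append(row[feature])
    return {pos: sum(vals) / len(vals) for pos, vals in sorted(by_position.items())}
```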
The "Fall" of the Keyword?
Most keyword factors are declining. This is one of the major findings of our studies over the years. Let me give you an example:
The decrease of the features "Keyword in URL" and "Keyword in Domain" is one of the more obvious findings of our analyses. You can clearly see the declining correlation from 2012 to 2014. Let's have a look at some more on-page keyword factors:
What you see here as well are very low correlations. In other words: with regard to these features, there are no huge differences between URLs ranking in positions one to thirty. But there is more to it than that. It is also important to have a look at the averages here:
Explanation: X-axis: Google position from 1 to 30 / Y-axis: average share of URLs having the keyword in the H1/H2 (0.10 = 10%). Please note that we have modified the crawling of these features; it is more accurate now. This is why last year's values are likely to actually be a bit higher than shown here. However, you can see that relatively few sites actually have the keyword in their headings. In fact, only about 10% of the URLs in positions 1-30 have the keyword in their H2s; 15% have it in their H1s. And the trend is negative, too.
By the way: what you see in positions 1-2 is what we call the "Brand Factor." It is often a big brand ranking in these positions, and most of these brands differ from the rest of the SERP when it comes to classic SEO measures.
Actually, taking only correlation into consideration can sometimes lead to a false conclusion. Let me show you what I mean with the following example:
The correlation for the feature "% Backlinks with Keyword" has considerably increased from 2013 to 2014. But the conclusion: "Hey cool, I will immediately do link building and tell the people to put the keyword I want to rank for in the anchor text!" would be a shot in the dark. A glance at the averages tells you why:
In fact, the average share of links featuring the keyword in the anchor text has declined from 2013 to 2014 (from roughly 40% to roughly 27%). What has changed is the shape of the curve: in 2014, the better a URL's position, the higher its average share of backlinks containing the keyword, and that share decreases steadily with each lower position. In contrast to last year's flatter curve, this monotonic decline is what produces the high(er) positive correlation.
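To make the point about the curve shape more tangible, here is a toy calculation with invented numbers (not data from the study): a lower overall average share can still produce a higher rank correlation, as long as the share falls steadily from position 1 downward.

```python
from scipy.stats import spearmanr

positions = list(range(1, 11))

# Invented example values, only for illustration:
# 2013: higher overall share, but roughly flat across positions
share_2013 = [0.41, 0.39, 0.40, 0.38, 0.41, 0.39, 0.40, 0.41, 0.38, 0.40]
# 2014: lower overall share, but falling steadily with position
share_2014 = [0.35, 0.33, 0.31, 0.29, 0.28, 0.26, 0.25, 0.24, 0.23, 0.22]

for year, shares in (("2013", share_2013), ("2014", share_2014)):
    rho, _ = spearmanr(positions, shares)
    print(year,
          "mean share:", round(sum(shares) / len(shares), 2),
          "correlation with better rank:", round(-rho, 2))
# The 2014 series has the lower mean, yet its strictly monotonic decline
# is what yields the strongly positive correlation described above.
```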
Conclusion: The keyword as such seems to continue losing influence over time as Google becomes better and better at evaluating other factors. But what kind of factors are these?
The "rise" of content
Co-occurrence evaluations of keywords and relevant terms are something we've been focusing on this past year, as we've seen large shifts in rankings based on them. I won't go into much detail here, as this would go beyond the scope of this blog post, but what we can say is that after conducting word co-occurrence analyses, we found that Proof and Relevant Terms played a major role in the quality and content of top rankings. Proof Terms are words that are strongly related to the primary keyword and highly likely to appear at the same time. Relevant Terms are not as closely related to the main keyword, yet are still likely to appear in the same context (or as part of a subtopic). These kinds of approaches are based on semantics and context. For example, it is very likely that the word "car" is relevant in a text in which the word "bumper" occurs, while the same is not true for the term "refrigerator."
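Searchmetrics has not published its exact term-weighting method, but the basic idea of co-occurrence counting can be sketched in a few lines. Everything below (the function name, the sample documents, the crude on-topic check) is a hypothetical, heavily simplified stand-in for real Proof/Relevant Term extraction, which also involves semantic weighting.

```python
import re
from collections import Counter

def cooccurring_terms(documents, main_keyword, top_n=10):
    """Naive co-occurrence count: which terms appear most often in documents
    that also contain the main keyword? Real Proof/Relevant Term extraction
    would additionally weight terms by semantic closeness to the keyword."""
    keyword_words = set(main_keyword.lower().split())
    term_counts = Counter()
    for text in documents:
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        if keyword_words <= tokens:              # crude check that the document is on-topic
            term_counts.update(tokens - keyword_words)
    return term_counts.most_common(top_n)

docs = [
    "The Apple Watch pairs with an iPhone and shows the time on your wrist.",
    "Many expected Apple to call its watch the iWatch, not the Apple Watch.",
]
# Naive output; a real system would filter stopwords and weight terms semantically.
print(cooccurring_terms(docs, "apple watch"))
```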
Proof and relevant terms to define and analyze topics
Let's have a look at an example analysis for Proof and Relevant Terms regarding the keyword "apple watch," done with the Content Optimization section of the Searchmetrics Suite:
The number behind each bar describes the average appearance of the word in a text dealing with the topic; the bar length mirrors the respective weighting (x-axis, bottom) and is calculated based on the term's semantic closeness to the main keyword. Terms marked with green check-mark bubbles are the 10 most important words, based on a mixed calculation of appearance and semantic weighting (and some further parameters).
As you can see, the terms "iphone" and "time" are marked as highly important Proof Terms, and "iwatch" is very likely to appear in the context of the main keyword "apple watch" as well. Note that simply reading the list without knowing the main keyword gives you an idea of the text's main topic.
The above chart shows an excerpt from the list of Relevant Terms. Note that both the semantic weighting and the appearance of these terms are somewhat lower than in the previous chart. In contrast to the Proof Terms list, you won't know the exact focus of the text just by looking at these Relevant Terms, but you can probably still get an idea of its rough topic.
Content features on the rise
By the way, the length of content also continues to increase. Furthermore, high-ranking content is written in a way that is easier for the average person to read, and is often enriched by other media, such as images or video. This is shown in the following charts:
Shown here is the average text length in characters per position, for both 2014 and 2013. You can see that in 2014 content is much longer (on average) in each and every position among the top 30. (Note the "Brand Factor" at the first position(s) again.)
And here is the average readability of texts per position based on the Flesch score ranging from 0 (very difficult) to 100 (very easy):
The Flesch score is given on the y-axis. You can see that there is a rather positive correlation, with URLs in higher positions featuring, on average, easier-to-read texts.
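For reference, the standard Flesch Reading Ease formula for English text looks like this (the study may use its own variant; the example numbers below are made up):

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Classic Flesch Reading Ease score: ~100 = very easy, ~0 = very difficult.
    Counting syllables reliably requires a dictionary or heuristic not shown here."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

# Made-up example: 300 words, 20 sentences, 420 syllables -> about 73 ("fairly easy")
print(round(flesch_reading_ease(300, 20, 420), 1))
```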
But just creating more (or easier) content does not positively influence rankings. It's about developing relevant and comprehensive content for users dealing with more than just one aspect of a certain topic. The findings support the idea that search engines are moving away from focusing on single keywords to analyzing so-called "content clusters" – individual subjects or topic areas that are based around keywords and a variety of related terms.
Stop doing "checklist SEO"
So, please stop these outdated "checklist SEO" practices, which, from my perspective, are still overused in the market. It's not about optimizing keywords for search engines. It's about optimizing the search experience for the user. Let me show you this with another graphic:
On the left, we have the old SEO paradigm: 1 keyword (maybe with some keyword variations; we all know the "An SEO walks into a bar" joke) = 1 landing page – checklist SEO. That's why, in the past, many websites had single landing pages for each specific keyword (and those pages were very likely to bear near-duplicate content). Imagine a website dealing with a specific car having a single landing page for each and every car part: "x motor," "x seats," "x front shield," "x head lamps," etc. This does not make sense in most cases. But this is how SEO used to be (and I must admit: the pages ranked!).
But, to have success in the long term, it's the content (or better, the topic) that matters, not the single keyword. That is why landing pages should be focused on comprehensive topics: 1 Landing Page = 1 Topic. To stick with the example: Put the descriptions of all the car parts on one page.
Decreasing diversity in SERPs since the Hummingbird update
How these developments actually influence the SERPs can be seen in the impact of Google's Hummingbird. The algorithm refactoring means the search engine now has a better understanding of the intent and meaning of searches, which improves its ability to deliver relevant content in search results. This means search engine optimization is increasingly a holistic discipline. It's not enough to optimize and rank for one relevant keyword – content must now be relevant to the topic and include several related terms. This helps a page to rank for several terms and creates an improved user experience at the same time.
In a recent analysis of Hummingbird, we found that the diversity in search results is actually decreasing. This means fewer different URLs rank for semantically similar ("near-identical") yet distinct keywords. Most of you know that, not long ago, there were often completely different search results for keyword pairs like "bang haircuts" and "hairstyles with bangs," which have quite a bit of overlap in meaning. Now, as it turns out, SERPs for these kinds of keywords are becoming more and more identical. Here are two SERPs, one for the query "rice dish" and one for the query "rice recipe," shown both before and after Hummingbird, as examples:
SERPs pre-Hummingbird
SERPs post-Hummingbird
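One simple way to put a number on how "identical" two SERPs like the ones above are is the overlap of their result URLs, for example as a Jaccard similarity. This is an illustrative sketch with made-up URLs, not necessarily the measure used in the Hummingbird analysis.

```python
def serp_overlap(urls_a, urls_b):
    """Jaccard similarity of two result sets: 1.0 = identical URLs, 0.0 = no overlap."""
    set_a, set_b = set(urls_a), set(urls_b)
    if not set_a or not set_b:
        return 0.0
    return len(set_a & set_b) / len(set_a | set_b)

# Made-up top-5 results for the two related queries
rice_dish_serp = ["example.com/rice-pilaf", "foodsite.com/fried-rice", "blog.com/risotto",
                  "cooking.com/rice-bowls", "recipes.com/rice"]
rice_recipe_serp = ["recipes.com/rice", "cooking.com/rice-bowls", "foodsite.com/fried-rice",
                    "chef.com/paella", "blog.com/risotto"]

print(round(serp_overlap(rice_dish_serp, rice_recipe_serp), 2))  # 0.67: largely the same URLs
```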
At a glance: The most important ranking factors
To get an insight into what some of the more important ranking factors are, we have developed an infographic that adds evaluations (based on averages and interpretations) in bubble form to the well-known correlation bar chart. Again, you see the prominence of content factors (given in blue). (Click/tap for a full-size image.)
The more important factors are given on the left side. Arrows (both on the bubbles and the bars) show the trend in comparison to last year's analysis. Also on the left side, the size of the bubbles is a graphic element based on our interpretation of how important the respective factor is likely to be. Please note that the averages given in this chart are based on the top 10 only. We condensed the pool of URLs to SERP 1 to investigate the secrets of ranking on page 1, without having this data influenced by the URLs ranking from 11 to 30.
Good content generates better user signals
What you will also notice is the prominent appearance of the factors given in purple. This year we have included user features such as bounce rate (on a keyword level) and correlated these user signals with rankings. We were able to analyze thousands of GWT (Google Webmaster Tools) accounts in order to avoid a skewed version of the data. Having access to large data sets has also allowed us to see when major shifts occur.
You'll notice that click-through rate is one of the biggest factors in this year's study, coming in with a correlation of 0.67. Average time on site within the top 10 is 101 seconds, while bounce rate is only 37%.
Conclusion: What should I be working on?
Brands are maturing in their approach to SEO. However, the number one factor is still relevant page content. This is the same for big brands and small businesses alike. Make sure that the content is designed for the user and relevant in your appropriate niche.
If you're interested in learning how SEO has developed and how to stay ahead of your competition, just download the study here. Within the study you'll find many more aspects of potential ranking factors than are covered in this article.
So, don't build landing pages for single keywords. And don't build landing pages for search engines, either. Focus on topics related to your website/content/niche/product and try to write the best content for these topics and subtopics. Create landing pages dealing with several interdependent aspects of main topics, and write comprehensive texts using semantically closely related terms. This is how you can optimize the user experience as well as your rankings – for more than just the focus keyword – at the same time!
What do you think of this data? Have you seen similar types of results with the companies that you work with? Let us know your feedback in the comments below.
First of all, thanks for sharing and breaking down all this data. It's really neat to see some of the more advanced measuring techniques and the results they create.
"Pages for keyword(s) vs. Pages for topics" - The data supporting the topical approach is awesome to see. We've all known this for awhile now, especially post-hummingbird, but having Searchmetrics data back that approach 100% is great.
I've been working extensively with a small client now who built out their website entirely on the keyword per page approach. Given it's a small site with little "brand power," crawl budget is also an issue. As we've added more complex, long(er)-form content, page consolidation (based on similar topics) has been one way we've addressed all those factors at once.
I was confident in this approach, but pumped to see your data directly support that decision. Sweet. Thank you.
As far as the infographic… - I'm shocked to see so many yellow-orange bars (social-related factors) at or near the top. Facebook seems to have quite a bit of "power" within these search-related factors. I'm curious though: were these pages that had been "liked" and shared on Facebook, or was it the following behind the brand Facebook page that was in SERPs?
Really not surprised Google +1s are so high, but more support for the odd social platform. :)
No-follow links still count - Love to see these links still carry weight. And given Hummingbird's topical understanding improvement, why wouldn't they?! I don't know about other SEOs, but I'm just as pleased when a client scores a relevant no-follow link if it's going to bring exposure and traffic. But having some confirmation they carry weight in ranking factors? It's the cherry on top.
Thoroughly enjoyed seeing such complex, sophisticated SEO-related data. Certainly a change from most of the content out there right now. Thanks for sharing!
I think that Facebook factor is more of a correlation than anything, as we all know that Facebook closes most of its information away from the major search engines.
Remember, if it's shared a lot via Facebook, more than likely people are linking to it and it's quality content.
Again, a great example of what Google wants on "its" internet: the end user is in charge. Users search for topics, not keywords, which is why Google is working toward a more "natural" search with more human language. That is why social signals and content that is useful to the user are rising in importance relative to "technical" SEO (although that is still very important).
Great analysis of this data, Marcus!
"But just creating more (or easier) content does not positively influence rankings. It's about developing relevant and comprehensive content for users dealing with more than just one aspect of a certain topic." This is where many people go askew. I often find myself telling clients that creating a low quality blog post once a week is far, far less valuable than collecting details on a single subject for a month and putting something "awesome" out for readers to consume. Incredible detail here, but would have liked to seen more info on how diversified media (ie, content with video or video + multiple images) compares to content that is text only.
I remember talking to a Googler a few years ago about SEO. He said, "Start doing SEO work with an attitude like you are not doing SEO." At the time I was a bit dumbfounded by his statement, yet as the years progress I can see the merit in it, given how much things have changed in recent times. The whole SEO checklist thing is outdated, I 100% agree. It's kind of like people who obsess over one or two keywords and only want to rank for those. But overall, good analysis; I will share it on a few sites.
Great post Marcus, I always enjoy listening to your ranking factors presentations at the conferences I've attended and I share pieces with my team hoping that they get a better understanding of why we should/shouldn't do things as SEO evolves.
Cheers!
Great post, Marcus Tober. I really appreciate your research on keyword performance. Thank you for teaching me how Google works; it will surely help me and make my work better.
I find the length of content to be a really interesting ranking factor. This seems to be directly in contrast to creating content that is to the point and optimized for mobile. As more and more people use mobile, content should be optimized for mobile, which would imply fewer sentences in smaller blocks of content. But if longer blocks of content rank higher, then you should have longer blocks and more content, which mobile users probably won't like.
I think oftentimes people are still hung up on the idea that keywords are the most important thing they should be thinking about when crafting an SEO strategy. They are an ingredient in the soup, and the soup has a lot of other elements that make up the overall flavor, much like an SEO campaign.
"But, to have success in the long term, it's the content (or better, the topic) that matters, not the single keyword."
This is something I had to reiterate last week with one of my clients! No matter how many times you force the word "banana" into your content, no one is going to really believe that your page is about "bananas." The contextual relevancy is what matters, not how many times you can use an exact keyword.
Google measures every word and search down to the millimeter; as soon as we type the first letter, Google already has in mind what we can or will find, associating words with accompanying terms and adjusting to our needs (the search). Google wants to manage the internet according to human thought. Very good article, very interesting.
Just read the full 83 page report.
Such great insight.
Wow.....just wow!
Nice tips for good and safe SEO. I have noticed that it’s even easier now to rank with blog commenting. Keep sharing like this.
You desperately need a new infographic maker or better image compression. I'm on a retina display and can't make out the blurry images you posted.
Thank you for the analysis and paper. You highlight some interesting points and also some recurring issues with SEO. If "experts" or prominent SEO people are polled, the results are susceptible to a "flavor of the month" bias, which has happened many times.
Alternatively, if you analyze the top ranking sites you don't know the causation, as you say.
Is it: [some factors] -> higher rankings -> more traffic -> facebook likes/shares ?
Or: facebooks like/shares -> higher rankings -> more traffic ?
hmmm... could be either!
Great post. It's not particularly shocking that link diversity is decreasing, especially as Google gets better at semantic search and structured data turns more of the web into entities (the sameAs tag and the rest of Schema markup in particular should continue reducing link diversity).
Hi Marcus Tober,
Great analysis of the ranking matrix, but I have a query regarding my website, which is a product-based website (https://www.somanyceramics.com). How do I target topics on this website, given that our important pages contain only products?
Kindly suggest how to add relevant data to the website to improve keyword rankings.
[link removed by editor]
Hi there! This is a good question to bring to the Q&A section of our site. We try to keep the comments on blog posts relevant to what the author discussed. Thanks! =)
Thanks so much for the post, Marcus. Eye-opening for sure. The creation of content for the masses, aimed to engage everyone on an easy-to-understand, conversational level... who'd have thought. You'd have thought it logical to create a site in that way to begin with... What about the social factors on your own site – comments – as well as the posting frequency? How do those weigh?
Don't you think Google is trying to dominate the world? Look at the SE Ranking Factors – the first one is Google+. Why? What's wrong with other social media sites? This means that if you keep promoting Google products, you will be the star; otherwise, not. :(
And now I have to translate it and post on my blog. Thanks.
Thanks for sharing the research document along with a proven case study! Keep up the good work.
Admin: Blogger Planet
In the "The Fall of the keyword" section the spearman factor for keyword in URL and keyword in domain name is negative in 2014. Does this mean that having the keyword in either of the two will affect rankings negatively?
Experience suggests that content oriented around keywords is more interesting... anyway, that's my point of view. However, your illustrations speak for themselves.
Great article. Things, not strings, huh? The one-landing-page-for-a-broad-topic idea gave me an idea anyway. Thanks a lot for sharing your thoughts and the Searchmetrics study!
This is a great article. It's very encouraging to see that Google is increasingly finding ways to place high-quality content at the top of searches. I will definitely be keeping this article in mind moving forward!
Very good read.
Great article, but in my opinion Ruben hit the nail on the head with "what Google wants." The old ranking factors still matter; they will no doubt change over time as outlined in the post, but a quick look at the sites ranking on page 1 for whatever search phrase shows they do not apply in the majority of cases today.
Great article Marcus. So if all of my content has been based on keyword research and including key terms to date, could anybody recommend where to go from here?
Should my content just mention a key word once or twice for example, and the rest be discussing the product/products with no structure, word requirements etc? Any assistance would be much appreciated!
Keyword research remains very important. Keywords are the starting point for building your topics. A topic is a cluster of keywords that have the same user intention in common. Once you have these keywords -> topics, you have to think about how you can split parts of the topic into different sections of your text, like a well-structured Wikipedia article. The length of the article depends on the user intention, too. If you are writing about a complicated topic and most of the existing articles are long and well explained, then you have to do that too. User expectation is important. Once you use these simple steps and follow the other instructions in my post, you start writing articles that don't just have keywords included. They have structure and are made for user intention. :)
Yes, keyword research remains very important....so you can figure out exactly what anchor text you need to build links with :-p. I'm pleased at Google's progress in terms of understanding topics instead of keywords, but a bit baffled that their reliance on anchor text remains as strong as it does.
At the end of the day, you just need to figure out what is in Google's head. They want to provide the most relevant information to the web visitor, without having to manually check hundreds of billions of webpages. I really miss DMOZ when it was operating above board, before it turned into a corrupt mess.
... I digress..
"content clusters" I call this "website themes" and it is important. .. "It is often a big brand ranking on these positions" That is really too bad that these larger brands are given an extra ranking benefit. There use to be a time where company size and brand didn't matter, It was all about your knowledge and ranking ability. It is fun Beating those big brands.
So 6 of the top 10 ranking factors are social signals? Hmm, not sure about that cause-and-effect conclusion, especially when large, well-funded companies take up the top spots.
.... Coffee
The correlation values don't represent the strength of a ranking factor. I know it looks like strength, but it isn't. We built a nice explanation of what a ranking factor is:
https://www.searchmetrics.com/en/what-is-a-ranking-factor/
Thanks for the link Marcus. I will take a look
Thank you, Marcus, for a fabulous post. Again, what we should be writing for is our "clients to be"... who do we want to attract? It makes all the difference, and semantic search is truly here.
Fantastic sharing, Marcus. Good content, click-through rate, relevant terms, and backlinks are the most important ranking factors. The ranking factors chart describes it all.
It's all more than fantastic, Marcus!
As I checked the attached screenshot of the 2014 ranking factors, I noticed "URL in no Subdomain" is 0.0. What does that mean? Sorry, but I couldn't understand. Do you mean subdomain websites have to make more of an effort to rank than a regular 'www.example.com/abc' website?
However, I can see that it is a very minor signal. But if you can give your input on that, it would be great.
Thanks,
What if somebody spams my website pinkcakeland.com to take it down? Any spammer or competitor can buy millions of links on Fiverr pointing to my website, and after a month my website would be banned. How does Google know whether the links were bought by me or by a competitor? Anybody can do negative SEO. How do I protect against it?
[link removed by editor]
Hi Dariusz, and welcome to Moz! We'd like to ask that you keep your comments on the blog posts relevant to what the author discussed -- Marcus didn't talk about negative SEO at all here, so your questions look out of place.
If you'd like to learn more about how to protect against negative SEO (and whether you should be worried about it), Marie Haynes wrote a great post not long ago -- I'd recommend reading it!
Thanks for sharing this post. Great to see such an elaborate explanation of this topic! With the introduction of Hummingbird, it's all about relevance and providing value to your visitors.
Interesting post, thanks for sharing.
Hi Marcus,
Great analysis. Thanks for sharing.
I cannot understand what the factor "Tweets" means. It surely cannot be the number of tweets. Number of followers, maybe? There is a similar ambiguity for Pinterest.
Very informative content here. I would have to agree that keywords are no longer relevant if you are not able to produce good, quality content from them. I would say that keywords are still there and still relevant, but only if we are able to create good content that is not solely for search engine purposes but, more importantly, for our target market. Cheers!
Thanks, Marcus Tober, for posting your research and comparison data for 2013-2014. It shows that Google keeps changing over time... and I am worried about my off-page SEO activities... SEO is now both very easy and very tough.
Nowadays, information (content) is king in SEO.
But I think in the future we will see a negligible distance between the end user and the Google search engine.
That's great Marcus! Very nicely done again. Data made appealing :)
My take: it got really interesting after Hummingbird. But it's pretty much the same things, except for Google+, as I really cannot get any definitive data for it. I just can't "prove" to myself that a page ranked solely because of G+1s.
Sharing this out. Thanks for the great infographic and study!
Hi Dennis, I fear that your phrase "I just can't 'prove' to myself that a page ranked solely because of G+1s" points to a common misunderstanding.
Without wanting to put words in Marcus' mouth, what we should always keep very clear in our minds is that "Ranking Factors Studies" are not telling us what things Google (or Bing) considers real algorithmic ranking factors, but what characteristics the sites that rank well have and do not have – therefore the possible correlations between those "factors" and the rankings.
Google +1s, then, are a very common characteristic of those sites; hence – with correlation analysis – we may say that "sites with more +1s rank better than sites with fewer or no +1s." +1s are not a direct cause of good rankings, but they may help in obtaining them, like any other social share:
Social shares > higher visibility > creation of second-tier backlinks (e.g., on Topsy) and improved opportunities to earn natural backlinks from people who discovered that shared content.
In the specific case of +1s, we may eventually dig further into the topic and try to see whether Google gives them a special function, but that investigation would be more in the field of pure SEO theory than in-the-trenches SEO, because we would be talking about patents et al., and we know that not everything patented is finally developed into something concrete.
Ranking Factors studies like the Searchmetrics and Moz ones should be taken as the most authoritative "checklists" of best practices, and used as inspiration for analyzing our SEO strategy and, if need be, changing it.
We should never take them as a magic-bullet list of things to do.
This is hard to explain to people who aren't entrenched in SEO, as to almost everyone this looks like a list of the most relevant ranking factors.
Just as a side thought, have you done any testing with Google+ buttons?
Whilst doing a bit of experimenting, I tried using the Google+ button on pages and observing the impact based on the number of shares. This is incredibly subjective, but I noticed that Google seems to almost instantly index a page "correctly" when it has a Google+ share.
The index velocity of Google Plus is quite notorious :D... In fact, I know of many SEOs who use G+ just for that.
Classic link dumping... In fact, I know many SEOs who used Twitter and other social networks for that.
Not going to lie, the only reason I jumped on the Google+ bandwagon (a couple of years ago now) was because of the rumored effects it had on SEO.
I find this extremely interesting - especially given that we have made so many correlations between G+ and rankings, to the extent that they are featured at MozCon, etc. "Every SEO should build their G+ empire" sort of thinking...
However if recent developments are any indication, it seems Google is looking to fold up G+...or not?
This is interesting, thanks.
It is important to understand that the basic parameter for page creation has changed from keyword to in-depth content, research and useful insights. Thanks for sharing the infographic.
Hi, Trevor!
Thanks for sharing the research document along with a proven case study!
Marcus Tober was the author of the study and the post... not Trevor. =)
Great post Marcus Tober! (It is worth everyone reviewing this good summary. These are the main emphases we pay attention to in our analyses.)
¿?... I mean, a spam link is a spam link even if written in Hungarian... :D
"It is good for everyone to review this summary. The main focus points of these analyses are what we observe on the xxx.hu website. Coming soon is the free trial." < This is a translation of the comment above.
Hi Szilvia! Please keep comments both relevant to the content of the post and in English -- the vast majority of our community (~80%) has its browsers set to English, and we want to make sure they can all read what you have to say.
Thanks!
Hi Trevor! My community is Hungarian, so I shared a link to this page for them; I know only a little English myself. Many of them do not speak English. However, almost everyone understands the tables and that this is important. Link deleted, sorry.