Tonight, as I was writing another blog post, I started working on a graphic that probably deserves a post and discussion of its own. Below is my personal opinion of how some of the key factors in Google's ranking system have changed through the years I've been in SEO:
A brief dissection of the criteria:
- Domain Trust/Authority - I think this was a factor that most SEOs did not seriously feel until after the Florida update (November of 2003), after which it skyrocketed into consciousness, crept along for a bit, and over the last couple of years, has become the dominant factor in the success of rankings at Google. That's not to say that things like exact-match domain names + lots of anchor text from diverse root domains can't still overwhelm the occasional page from Wikipedia, Amazon or the BBC, but the preferential treatment has reached new heights.
Just in the last couple days, we've started seeing authority sites like Technorati re-ranking for their tag pages on virtually every SERP they target. Google's "Brand" or "Vince" update also points in this direction, as does the collection of user-data and usage metrics that are potentially being applied or could be leveraged in the future.
- Anchor Text in External Links - While this is still a very powerful ranking tactic, it's not the powerhouse it once was, and before ~2004, I really felt it didn't carry the same power it did in the years afterwards. Today, my belief is that anchor text has come to be regarded much as PageRank was after its dominance in the algorithm - as a technique that SEOs have focused on gaming to such an extent that much of it has become noise, and it's only really valuable when found in conjunction with other positive signals (or at least, this is how Google thinks now, and their algorithm is still in transit towards that destination).
- On-Page Keyword Usage - There's little doubt that when I first started doing SEO, even Google was more susceptible to keyword stuffing. Incidentally, I think this gave rise to the myth of keyword density as a ranking factor (or at least, didn't help slow it down). Today, it hasn't lost all of its ranking power - it still sits in the middle as an essential element, but one where "more" won't help you. This is in direct contrast to the other elements I've included in this diagram (where more does equal better rankings).
- Raw PageRank / Link Juice - In the early days of my SEO career, PageRank was everything (or nearly everything). Manipulating rankings was as easy as getting a few high PageRank links, and this exploit, along with Google's display of PageRank in the toolbar, built industries of link sellers and buyers we still see today. In 2003, PageRank was already on the decline as a ranking factor - a decline that has continued to this day. My feeling is that now PageRank can still make some difference, but it's much more effective for Google as a determining factor for inclusion in the index and comparison against duplicates & scrapers.
BTW - When I say "PageRank" I'm referring to the original, egalitarian concept of links as votes, and the idea that every page and every link passes link juice in a similar fashion to help calculate the raw popularity of URLs in Google's index.
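For readers who haven't seen it spelled out, the "egalitarian" version of PageRank described above can be sketched in a few lines of Python - a toy power iteration where every link passes an equal share of its page's score. This is purely illustrative: the damping factor and iteration count are textbook defaults, not anything Google has published, and the real system is far more elaborate.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy 'links as votes' PageRank.
    links: dict mapping each page to the list of pages it links to.
    Dangling pages (no outlinks) simply leak score in this sketch."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start every page equal
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # each outbound link passes an equal share of this page's score
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank
```

Running it on a tiny link graph shows the "popularity" effect: a page that collects links from several others ends up with a higher score than one nobody links to.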
Now I'd really like to hear your thoughts. Where am I right? Wrong? Totally off track? And maybe even a question of where you see these lines heading in the future.
p.s. For more detailed coverage of these factors, see My Personal Opinion - 90% of the Rankings Equation Lies in These 4 Factors (from 2007, but still shockingly relevant)
p.p.s. Yes! We have changed the comments format and the thumbs up/down visuals a bit, so if you have feedback, feel free to leave it here.
Back in 2003 low quality links with targeted anchor text were killer powerful. Within 6 months of getting online (and about 3 or so months of doing SEO) I was ranking in the top 5 for Search Engine Marketing. Back then I had no brand, no awareness, and no money...so almost *all* of the links were low quality spammy manipulative links. And I was killing it with anchor text.
I actually felt like anchor text started to slide in importance more in the 2005 timeframe.
I agree, Aaron... at that point the topic of the page that hosted the link started to become important. I believe that Rand's line for link anchor text traces anchor + topic influence.
Great to see you here, Aaron! I think you might be right, as your experience with a greater variety of sites almost certainly exceeded my own at that point. Thanks for the contribution.
Hey Rand - you know it would be really very cool to see V3 of Ranking Factors - do you have anything in the pipeline for a reissue anytime soon?
Yeah! It's time for V3!
I was thinking the same thing. Thanks for beating me to the punch!
ditto, I recall finding v1 very useful when I was starting out. Anything to demystify SEO and dispel the old myths that still stick to the industry can only be a good thing.
I absolutely agree with your post, though I am not as experienced as you guys - I wasn't in the industry during Florida and the other updates. The factors and graph shown in this post reflect the most relevant factor importance I've seen so far.
I see that you've taken only general factors into the graphic, but I'd be glad if you could show a graph with the sub-factors that fall under these broad factors.
In general, from your post and my own thinking, I'd say that to dominate SERPs you need "enough backlinks with targeted anchor text from authoritative, relevant & high-PR websites."
-DS
On your graph, External Anchor Text links need to be split into Quality External Anchor and Any External Anchor. Crappy links no longer work like they did (most of the time) - see Aaron's comment. Quality links do - as strong as ever.
This is where I think Trust and Authority directly come into play: the Trust and Authority of the site your External Anchor Text links come from. And this would be a useful additional graph: the importance of different criteria on External Inbound Links (text being just one criterion).
Wikipedia, Amazon or BBC pages get no rankings without some relevant On-Page or External Anchor Text (to the site). None.
So Trust and Authority (and raw Page Rank) are worth nothing without On-Page or Any External Anchor Text.
And unless a site is banned, then Quality External Anchor can deliver results with no other factor. Even On-Page can if the market is uncompetitive enough (it still happens).
So I see no case for promoting Trust-Authority above Quality External-Anchor.
The future: it's an obvious move for Google to continue to increase the importance of the context of links (Quality) and smart SEOs have for years been building quality links so their sites are prepared for this.
It's also pretty much irrelevant if you think Trust-Authority or (Quality) External Anchor Text are more important because your actions will be the same: build quality links using great content and you'll get both Trust-Authority and Quality Anchor Text.
The graphical display is very useful. Obviously when you say "trust" we're talking about the trust Google places on a site, which may not always be a view held by the average user. I would only say this is a powerful factor for very popular "head" searches. As soon as you get just a little bit down the tail this is less of a factor.
Given the products/services you sell you understandably have a bias against Google's PR, which I think gets a little bit of an unfair bashing here. I'm not saying it is wonderful, just that it deserves a little more respect, even in 2009.
No real mention to a structured internal linking system? Exceptionally important I feel.
Long time reader, first time poster and all that, so be gentle :)
Not wanting Mr Fishkin to get big-headed with all this praise for his graph ...
It's a great visual, but it seems to need normalising. Notice that the sum of the factors is higher at the end than at the start, and that the sum peaks around 2005. This could represent greater processing in order to establish relevancy. Or it could already be normalised against unshown parameters?
The other thing is that "trust" is surely a hand-wavy shorthand for hidden algorithm details. Perhaps the next graph could be factors affecting the trust for a domain. As this is now the most important factor this graph appears to imply that the transparency of how content can rank well has decreased?
I hate when SEOs use vague words like "trust" to describe a broad range of ranking factors. The most ironic part of Rand's graph is that it starts and ends with [Trust/Authority of Host Domain] and [Raw PageRank/Link Juice] at extreme opposites on the scale of importance. In reality, these two ranking factors are essentially the same thing.
I agree in most part. I am sure that anchor text value did not start falling off in a major way in 2005, and I still don't think it has fallen off as stated here. IMO what you are seeing is something along the lines of the following....
If in 2006 I had 1000 links pointing to my site with "SEO Pro" anchor text I would get credit for the anchor from nearly every link.
Today if I still have all 1000 links with the same anchor text, Google may only be giving credit to or allowing 300 - 500 of those to pass any real value.
This is because, again IMO, Google is carving low-value links out of the algo, such as junk directory listings and the like.
Also, Google may be looking for outliers such as a huge influx of links with perfect anchor text. This does not look natural. So if my site was XYZ.com Google should expect to see links with anchor text "XYZ", "SEO Pro", "Professional SEO", and "www.XYZ.com". If the balance is not normal, then it appears to be an outlier and the bulk anchor text from "SEO Pro" may pass very little if any value.
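To make the outlier idea above concrete, here's a toy Python sketch of how such a check might work. Everything here is hypothetical - the branded-vs-keyword categories, the substring matching, and the 40% threshold are invented for illustration; Google has never published how (or whether) it does this.

```python
from collections import Counter

def anchor_outlier_share(anchors, brand_terms, max_keyword_share=0.4):
    """Hypothetical check: flag a link profile whose exact-match keyword
    anchors outweigh branded/URL anchors. The 0.4 threshold is invented
    for illustration - nobody outside Google knows the real numbers."""
    counts = Counter()
    for text in anchors:
        t = text.lower()
        if any(b in t for b in brand_terms):
            counts["branded"] += 1   # e.g. "XYZ", "www.xyz.com"
        else:
            counts["keyword"] += 1   # e.g. "SEO Pro"
    total = sum(counts.values()) or 1
    keyword_share = counts["keyword"] / total
    return keyword_share, keyword_share > max_keyword_share
```

Using the commenter's XYZ.com example: a profile dominated by "SEO Pro" anchors with only a couple of branded anchors would trip this kind of check, while a profile that's mostly brand and URL anchors would not.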
Your last point is really terrific...if Google really wants to spot link manipulators, they should look at the ratio of links with website URLs (e.g. www.xyz.com) and company name (e.g. xyz) as anchor text to total links.
Real humans (including bloggers and professional reviewers) tend to use the company name or main URL as the anchor text.
If Google really wants to spot manipulative links, just treat non-URL/company name links as rel=nofollow!
:-)
You've got it! What would your target market do if they were creating links to your site? Probably link to you with your brand or domain name and occasionally use keyword anchor text. That's natural!
Which raises the question: why does Google's algo still rely so much on anchor text as a ranking factor? I'm not convinced it's been all that devalued, but as you say, the "natural" pattern of linking is not to use keywords in the anchor text but (most likely) around it... Is this just down to 'laziness' (i.e. it takes less processing power to use the anchor text than the keywords around the link)?
Really interesting stuff Rand.
If there's less reliance on anchor text, would you say there's also been an increased reliance on the context of the link, ie the keywords around the link itself?
Re the authority/trust issue, for those who haven't read it already, Chris Garret wrote an excellent post recently about growing your Google authority - well worth reading.
Ok, I started writing a comment, but it got a bit large, so I turned it into a blog post. My thoughts are available at the Searchlight Digital blog.
The quick version is: domain authority, in one form or another is going to become a stronger and stronger signal, so build your online presence around that. Be the authority on whatever it is you do.
I'm surprised to see "raw pagerank/link juice" so low on the chart for 2009, going from dominant to low. Just the other week Rand posted the study on opencube.com ranking for the term "SEO". OpenCube does not have anchor text for the term SEO and does not have on-page keyword usage (they only mention SEO in the title tag, I believe). The reason they were able to climb into the top 10 was link building (aka link juice/raw PageRank). After stressing inbound links and then showing them so low on the chart, I'm not sure which one to believe.
I agree with what's been said so far.
However, when it comes to niche industries, in my experience it is still relatively easy to rank highly simply using on-page factors alone.
I 100% agree with Mintyman on this one.
I've had several niche clients rank on the first page of SERPs for a term we did zero link building for (optimized pages only).
I agree with most of this. But there are some concerns I have. These are my opinions, and I know there are some people who take everything they read as truth. Beware: Opinions Below...
PR has almost nothing to do with ranking for a given keyword - until you get into the high-PR sites.
Authority and Trust are established through anchor text and PR, with a higher emphasis on raw link juice.
Can you show me a trusted site with a low PR?
And an authority site is generally an authority on a particular subject. right? So this authority on the subject is established mainly through incoming anchor text links.
I think Google is just getting smarter about these issues and discounting links for a lot of reasons. Anchor text is discounted for various reasons, as is link juice. I'd rather not get into explaining this too much - it would take all day. =P
I believe links will retain their value when they are good links. The definition of good and bad links is just going to be changing in the algorithm.
On page keyword usage has to play a role in ranking... just not a very big one.
In conclusion:
Raw PageRank/Link Juice + Anchor Text Links = Trust/Authority. Provided your site can pass a human review, it will remain a trusted/authority site.
Great post. It is all about trust of your domain and user behavior in my opinion - TOSAS, bounce rate, navigation, CTR%, etc.
Do you think you could maybe do a brief post of how to implement the important SEO factors today? That would be a great follow-up.
Thanks for the post!
It's interesting that your PR line + trust / authority line is essentially flat - i.e. the influence of "powerful / trusted" links in the algorithm has (according to this chart) stayed relatively flat but the definition of "powerful / trusted" has changed from just being a PR question to a much more subtle and trust-based question.
Regarding the exact-match anchor text and Aaron's comment above, I think the importance continued (especially in the UK) longer than at least Aaron is pushing for - and especially on moderately competitive SERPs.
A lot to think about there, Rand - thanks again :D Wonder what the graph will look like a few years from now!
Exactly. This graph really breaks it down and shows how dramatically things have changed over the last few years. As search evolves, the pace of change will only grow.
Hi Rand, a great summary of trends for the big 'G's algo changes over the years. There's definitely a bias towards authority sites happening as you said, and for one I'm glad of that... means I can really concentrate on creating good content and adding to user experience instead of chasing links!
Rand - I like the graph, and agree with the overall trends... BUT WAIT: I scanned through half of the numerous replies (kudos, but this is why I normally won't wade through "SEO Blogs").
SergeA and I think it was BWilford were the only 2 that I came across who alluded to what I see as THE major new algorithm element largely lacking in SEO/SEM "news" circles.
When I attended AdTech 2008, Bruce Clay talked up standard SEO in a presentation w/ Google's Aaron D' Souza on the panel. NEVER was real user interaction w/ sites mentioned in ranking factors.
I spoke w/ Aaron afterward, and he confirmed that Google was tracking: 1) return rates to SERPS, 2) Browsing behavior of people logged into Google's numerous apps, 3) Personalized search "votes", which you now see in those arrows you can click when logged in and viewing SERPS.
He said Google would not allow a ranking boost from these algo elements, as too many sophisticated spammers are out there that would abuse it (sure, ok). He did confirm that NEGATIVE results in return rates to the SERPS, tracked user behavior, etc are at play in ranking.
I see this as directly related to TRUST ranking factors, such as brand. If your site has earned traffic, and retains the traffic it gets on a phrase-by-phrase search basis, you earn the right to be in the SERPS. If not, your ranking won't "stick".
I see this in my analysis of (extensive) web traffic, as a hands-on SEO. I'm always surprised that the SEO community hasn't yet stressed visitor engagement metrics as key to ranking success.
I certainly go along with your assessment, Rand. I also think this evolving algorithm strategy is also improving the general relevance of what is provided in answer to a keyword query.
Overlaying this is another layer which is almost becoming more important. It's most evident with the latest version of Google Local Search where it will try to give you local results for dentists or pizza even though you did not mention your location. In other words I'm talking about personalization.
It's almost as if Google is trying to guess the best possible web page to return for you as an individual. That will even take into account how much they know about you from what they have seen of your surfing habits.
I believe this 'overlayer' is vastly more important than how the traditional factors play into the algorithm. They have almost become secondary to the way Google chooses among the possible web pages that are right for you as an individual.
IMHO
I have a reasonably new site(10 weeks) with very little backlinking built yet. Some are authoritative but most are run of the mill. I am seeing 5th place in my local map for SEO. My site is built with very little content on the homepage on purpose. I have a blog with about 10 articles now and I try to publish a minimum of 3 times a week. Google Maps dominates!
Conclusion: Maps + select content + Blog= high ranking
Your thoughts in return?
Are people making a mistake bundling trust and authority in with "Branding"?
Do actual site analytics play any role? P&G may be the biggest brand in Laundry Detergent. But If a "Laundry Detergent" page by an unknown brand has high engagement, low bounce rates and a high percentage of click throughs compared to a P&G page....wouldn't that page be more trusted? Are not analytics votes?
Good questions. I think there is a large percentage of people that hope and expect to see big brands for broad search queries. Also, Google does have an obligation to investors. Big brands have big PPC budgets and Google reps. If enough big brands with multi million dollar PPC budgets push their frustrations up the ladder, future algo changes might give them an edge. This is the same dilemma that newspapers face and other traditional forms of media. Will Google keep everything black and white? Or will they play that fine line in the name of trust and authority to keep the masses happy enough with the natural search results?
I agree completely with this one - a site with authority may be the big player within its niche, but that doesn't mean it will automatically satisfy the search query. I guess the way to look at it is that those 4 factors are still all very important and none should be discounted/overlooked in favour of another - and the smaller players should look to optimise for the long tail if the trusted sites are going to rank automatically on the short tail.
Rand, I really think your OpenCube.com example reveals and demonstrates the actual trends. Trust/Authority is becoming more important, while at the same time Anchor Text is losing importance. OpenCube.com has a great many in-links and almost none of them have SEO in the anchor, but they still managed to achieve good rankings for the term SEO.
Very interesting, Rand! I agree in general with what you've set out; however, I have a question for you:
If as you suggest the importance of anchor text is diminishing and that trend continues, what do you think Google could replace this metric with?
I can't see it going the same way as Pagerank; while it has been gamed if Google comes to rely much more heavily on large/established brands it leaves the door open for a return to paid linking...
The real question you need to be asking is, does it need replacing? As far as I'm concerned, it was a rubbish metric in the first place. Instead, I'd look at (and this is what I think Google is starting to do more) what the content of the page with the link on it is, what the content immediately surrounding the link is, and what the page (and site) that the link is pointing to is about.
That way, you still retain the idea of link contextual relevancy, but you make it far harder to manipulate rankings through simple text link buys. Every link would need to have content relevant to the link and to the page and site the link is pointing to. Basically, you'd be paying people to write about you - more like a sponsored blog post than a text link.
Excellent post, and great research to produce the graph. I never thought of presenting these things with a graph; I was always trying to put those facts into words.
This is definitely a great resource.
To me this clearly demonstrates that Google is getting more and more "social", focusing on "human visitor" behaviors more than the classic "search engine" optimization elements like on-page optimization and the famous PR...
To build trust with the search engines, I believe that we must first build trust with our visitors.
Again awesome graph, and it got me inspired for some future posts!
Nicolas Prudhon
I just wanted to quickly add that doing well with any one of these factors in a niche that just doesn't quite get SEO/SEM can make a huge difference in your rankings, allowing you to at least temporarily dominate. Conversely, in a niche that really gets SEO/SEM, the more of these ranking factors you'll need to master in order to dominate.
Like the others, I agree the graph makes your message much more easily comprehensible at first sight. I'm still relatively new here compared to other veterans (no offence :P), but I didn't know anchor text in external links was more important than on-site keyword usage.
You missed the main new criterion that has been added. GrowMap almost touched on it. The social aspect of users is the new item that Google is tracking. Yes, they are slowly working out the arbitrage sites, which will give credence to the domain argument, but that is simply to eliminate crappy sites from the index. The part you're missing is Google's increased weight on personal interaction. This data comes from a number of places, but it all adds up to the social factor that is the most important ranking aspect today.
If your site is preferred by people, it is preferred by Google.
Nice post Rand. I would've added one more line depicting the increasing importance of semantic vocabulary over time (both on and off page).
Rand's model looks at ranking factors over time, but it's useful to look at the site's historical performance as a dynamic model to see what it's relying on to rank at different parts of its life cycle.
Sites we have worked seem to have life-cycle phases: new buzz, build-up, critical mass, peaking, momentum, and decline as the S-shaped technology replacement curve kicks in and the crowd moves on to a new site.
What "seems" to make sites rank is tied to the site life cycle. New sites need on-page keywords to get ranked at all, and need some social buzz to increase traffic to get into build-up, where link-building takes on a life of its own. When you get up around 3K to 4K backlinks, the backlinks start to come on their own, achieving some goofy kind of critical mass. By this time, the trailing edge users have found the site and the site peaks in traffic.
With site age and lots of backlinks as factors, the site seems to have momentum that keeps it well-ranked even without needed maintenance. By this time the trendy leading-edge people are long gone, laying down links to newer sites. The site's keywords and aging links keep it in the running for a while - it retains authority and site age - but newer sites start to win out on new content, spikier traffic, and a rate of change in traffic that marks them as being in the build-up phase.
So, perhaps this would be an orthogonal view of the situation, complementary to the ranking factors view. A way to think about why and how a site is ranking at different phases in its life cycle. Knowing where a site is in its virtual life cycle might help you make a decision about how best to help within an existing budget.
Hi Rand - great graphic.
Any thoughts on how to interpret this in terms of strategy going forward?
I have clients who run businesses that aren't necessarily trying to be an authority on a particular topic - e.g. a hotel company (maybe I misunderstand Authority here - makes more sense to me if it's relevancy).
Additionally, I'm not sure I fully comprehend the meaning of domain trust. As the trust / authority criteria rises - how do you envisage these types of businesses taking advantage of these criteria - or is this just a waste of time as established & branded sites will be near impossible to move?
If trust / authority are off the table - does a strongly executed strategy in the remaining criteria still amount to more than hill of beans?
Cheers,
Andrew
Lovely post... I wonder where you get this information from. It sure isn't available to the common man.
Great job, thanks,
I love the graph and I agree with your conclusions.
One thing I would add though in reference to your exact match keyword domain statement is that rather than just:
"That's not to say that things like exact-match domain names + lots of anchor text from diverse root domains can't still overwhelm the occasional page from Wikipedia, Amazon or the BBC, but the preferential treatment has reached new heights."
I would contend that having an exact-match keyword domain name is in fact a shortcut to being recognized as an authority site. In other words, it seems that if you can find a domain name ending in .com that is an exact-match keyword domain name (I talk about what exactly this is in a previous YouMoz post), then you can be recognized and treated as an authority site much quicker than if you were to try to build out and rank a different type of domain.
So buying old trusted domains would be a way to go then? I see a new war coming between old domain buyers and Google...
Why else would Google become a registrar but then still opt to use Godaddy to handle their domain service? Google wanted direct access to complete whois information so that they could identify suspicious domain transactions.
Great way to display your ideas on ranking factors. The big question is what makes your domain trustworthy?
Yeah really. In my mind, trust/authority is a combination of all the other factors, but I think Rand interprets it differently. So what goes into trust? Big brands seem to receive this automatically, but how does the little guy build it up?
I've been thinking about this problem for some time... I'm a software engineer and have been recently getting into SEO. So my theory comes from my engineering side of the brain.
So how I would tackle this problem if I was Google?
1. Temporarily increase a smaller sites rank in the SERPs for a competitive keyword, for example 1% of searches that day.
2. Use the data collected to analyze the return rate (users returning back to Google results) of click through's to that website.
3. Modify the sites ranking for that keyword based on the return rate. If users did not return back to Google to check other results, it is likely that the users found what they were looking for at that site. If % of returns from the site is greater than other comparable sites, then decrease ranking.
This would apply to all sites/results and would be only a part of the ranking criteria.
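As a toy illustration of step 3, here's roughly what that adjustment might look like in Python. The function, its inputs, and the one-position step size are all hypothetical - this is just the commenter's proposal made concrete, not anything Google has confirmed.

```python
def adjust_ranking(base_rank, site_return_rate, peer_return_rates, step=1):
    """Hypothetical version of the experiment described above: after
    temporarily boosting a site, compare its return rate (users bouncing
    back to the SERP) against comparable sites and nudge its position.
    Lower rank number = better position."""
    if not peer_return_rates:
        return base_rank  # nothing to compare against
    peer_avg = sum(peer_return_rates) / len(peer_return_rates)
    if site_return_rate > peer_avg:
        # users came back to the SERP more often than for peers -> demote
        return base_rank + step
    if site_return_rate < peer_avg:
        # users stayed on the site -> promote (never above position 1)
        return max(1, base_rank - step)
    return base_rank
```

In a real system this would presumably be one weak signal among many and would need safeguards against noise and manipulation, but it captures the intuition: if searchers keep pogo-sticking back to the results page, the temporary boost doesn't stick.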
I'm not positive if I have seen an example of this in analytics of the company I work at, but I notice random traffic jumps in competitive keywords for short periods of time.
I'd love to hear if anyone has some input on this theory.
Edit: My reply was actually supposed to be to interactivevoices and Gaurav Kohli on how new sites would ever get exposure in the first place. Though it applies to this discussion also.
Great chart Rand. I have only been introduced to SEO since 2007, so I'm always looking for ways to improve my SEO History knowledge.
Like some of the comments above, I am wondering if new metrics will be considered in the future.
TMI! You're killing me Rand...If my competitors get wind of this, I'm going to lose my competitive advantage :-0
Good post and very timely. We've been working on a site critique and were asked to consider whether the URL should change. This has certainly got us thinking about this aspect of the brief again...
When I first started (2004) you could get a top-ten listing on G.uk by adding a page of keyword-heavy content and firing some keyword-specific links at it. When I think about it, I still do something similar! But I no longer think about writing those pages with a 3-4% keyword ratio, or to please anyone other than the visitor.
Where would you put the line for title tags? Straight across the top but dipping slightly in the last year?
Has anybody recently observed the PR of their pages going down a bit? Some pages of mine had PR 4 and dropped to 3 in the last week or so. I was wondering if something has changed recently that I'm not aware of - or is anybody else facing a similar situation?
very cool post.
I would add that excessive, repetitive anchor text links - in my experience - can have a lowering effect in the SERPS on the targeted terms, especially if the linking domains are low to mid quality (excluding all spam).
I read something recently about a domain's registration playing some role in determining trust.
I believe the blog or article said something about domains that are about to expire possibly being viewed as temporary, and therefore spam sites.
Does having an extended domain registration with a company that doesn't do short-term registrations affect how search engines view a site's trustworthiness?
Thank you!
Lorianna
"I read something recently about a domain's registration playing some role in determining trust."
I have heard this as well. I think that there is some small benefit to registering a domain name for the full 10 years rather than just 1 year at a time - although I am beginning to have second thoughts about this because some registrars don't even allow registrations for longer than 1 year (1and1 is an example of this).
I have seen something similar. I have heard from a few individuals that a 2-year registration is best - that way search engines don't think you are spamming. I don't know that anything longer has a benefit either.
Also, they say that public registration is better. Have you heard anything on that one?
Thanks Randfish for presenting an obscure topic in a way that ordinary people can see. I prefer simple absolutism over abstract relativism; my memory functions more smoothly that way.
I'd really love to see this graph updated for 2011.
I agree that Trust has become much more important over the last few years; however, this is not an independent quantity, I think, and is most likely related to the quality and quantity of backlinks - so in a way Trust and backlinks are intertwined. Also, I think it is not quite correct to compare link juice and anchor text in external links, as those are not separate entities - I would rather compare "untargeted" backlinks and "anchor-based" backlinks (unless, of course, by "link juice" you mean solely internal linking).
Excellent. Any plans to update this in 2011?
My SEO career goes back to 2002. As Aaron already noticed, low-quality spammy manipulative links worked then... and work even now. The secret lies in the deep link ratio - you can't point all those PR4+ pages at your homepage; for every link to the homepage, add 4-20 links to deep pages, 100 links at once. Go on, give it a try! Trust is the only thing we can't measure, so let's play with the things we can.
PR really is such an unimportant thing nowadays, I wouldn't worry about it at all.
Definitely useful in quickly spotting problem sites, though.
This issue of trust and authority now explains something I've seen only recently in my own niche, which is a Mexican resort town. Within the last year, I started noticing a different kind of site ranking in the top 20 results for the town's name: Fodor's, Frommer's, Wikipedia, Wikitravel, Lonely Planet and others that I had never seen before. Now I understand why.
Thanks, Rand. And I love the graph - it keeps these concepts easy to visualize.
Rand, this is another excellent post and I have just finished reading my southern UK colleagues comments (Mark Nunney) and totally agree with him on the point about the contribution of quality links.
I recently launched a new site for a client just over 6 weeks ago which came with a brand new domain. I was astounded to find the site has already achieved a PageRank 4/10 and I am basing the site achieving this from two factors - On-Page anchor text links (weaved into good quality original content) and good quality inbound links to different category levels of the site from carefully selected external websites.
I have also witnessed with some of my own competitor sites whereby the website might be sitting with only 25 pages directly associated with the site and literally no anchor text between pages but of course the website is sporting a bundle of quality inbound links pointing at the site.
However, I might be at a crossroads where Mark talks about Trust Authority, I think I recall a certain Rand made video I got recently that explained in some detail how Google was calculating on this factor.
This is all excellent material for debate, much better than what I heard yesterday here in Scotland, when a certain PA/media company told an audience to focus heavily on their HOME page and to aim to operate at least 30 websites to support their business!!! Can you believe there are people out there saying this stuff - unbelievable, they need MOZED!
I totally agree - Google has changed its algorithm over the past years. That's why I think SEOs must adopt a different approach to face these changes.
Hi Rand
Thanks for the insight. Great post. I'd still like to believe on-page keyword usage will be of significant importance in determining the ranking of a site, as a keyword query should bring up a site with relevant content. I agree that "more" won't help, but the right balance with the most relevant surrounding content should still play an important part in SEO.
Alain.
Now for the million dollar question: how do you define authority for a query?
Google is trying to steer away from links (not entirely), so how else could they measure the trust of a brand? I'm thinking Google will start looking toward social media platforms (Digg, Reddit, Facebook, Twitter, etc.); how or when is anyone's guess, really.
I'm thinking more along the lines of:
- time on site
- pageviews per visitor
- bounce rate
- % of returning visitors
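As a thought experiment only - Google has never confirmed using these metrics this way, and every weight and cap below is an invented assumption - here is a minimal sketch of how the four usage signals listed above might be normalized and blended into a single engagement score:

```python
# Hypothetical engagement score combining the four usage metrics above.
# All caps and weights are illustrative assumptions, not anything any
# search engine has documented.

def engagement_score(time_on_site_sec, pages_per_visit, bounce_rate, returning_pct):
    """Blend four usage metrics into a single score between 0 and 1."""
    # Normalize each metric to a 0-1 range (the caps are arbitrary choices).
    time_norm = min(time_on_site_sec / 300.0, 1.0)   # cap at 5 minutes
    pages_norm = min(pages_per_visit / 10.0, 1.0)    # cap at 10 pages
    stickiness = 1.0 - bounce_rate                   # lower bounce is better
    loyalty = returning_pct                          # already in 0-1

    # Simple weighted average; the weights are made up for illustration.
    weights = (0.3, 0.2, 0.3, 0.2)
    parts = (time_norm, pages_norm, stickiness, loyalty)
    return sum(w * p for w, p in zip(weights, parts))

# e.g. 3 minutes on site, 4 pageviews, 45% bounce, 25% returning visitors
print(round(engagement_score(180, 4, 0.45, 0.25), 3))
```

The point of the sketch is just that any such composite would require per-metric normalization and arbitrary weighting - which is exactly why a new site with no visitor history (the objection raised below) would score poorly by default.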
@Interactivevoices: If Google takes these parameters into account for ranking, how will new sites and businesses get exposure in the first place? They will always play second fiddle or will have to shell out money for AdWords, etc.
Gaurav
Nice post - always love graphics, much easier to understand at midnight :)
How do these changes affect future plans for the MozToolbar?
Awesome research. I'd like to see a lot more in this area. Keep up the work and effort. I've done some serious SEO on the following site and am upset with my rankings. All was done perfectly. Sometimes I'm just not sure if SEO and traffic are real. [link removed]
How about adding a "freshness factor" - i.e., rate of updates/inbound link frequency - to the graph, to account for the increase in Query Deserves Freshness as a factor? I would suggest the line for that would look a little similar to the "Trust/Authority" line...
I'd say this graph is spot on from what I'm seeing in my niche.
An interesting article. I've definitely had the opinion for some time that Google hugely favours authoritative sites. However, I think it's important to take a holistic view of SEO. All of the aforementioned criteria are important and are woven together.
That's an interesting post! I totally agree - Google has changed its algorithm over the past years. That's why I think SEOs must adopt a different approach to face these changes.
I agree with everything outlined in this post, but I'd like to add "timeliness" before the SERPs were updated as frequently as they are now. I remember back in the day (2002-2004) when the Google SERPs were updated once every 3 days or so and news stories weren't included in the rankings. So whoever came across a breaking news story and changed their homepage title tag to target that story first was the one who was going to rank in the top position for about 3 days and get slammed with search traffic, until the major news sources and authoritative sites eventually bumped you down. Ah, those were the days... maybe I'll write a YOUmoz post about my experiences. :)
For someone relatively new to SEO, it's very interesting to see how it has changed over this period of time, and especially where it will go in the future!
On the subject of domain trust/authority... do you have any thoughts on purchasing 'used' websites that already have a history, as opposed to creating a completely new domain?
OK! So you've done a lot of "linking" and now you rank...
but your bounce rate is going to be high and conversions are going to be low if you don't have links coming from the "right" places.
At the end of the day. Most people care about making the sale or conversion, not just impressions.
I feel we are heading in this direction, and Google will change the algorithm again because Google's rankings are really getting spammy. Am I wrong?
Love your little graphs; they help the post so much. I'm new to this area, so it's good to see where SEO has come from.
Great post. I think we will see, in the very near future, the importance of "on-page keyword usage" go down to virtually nothing.
But will raw PageRank continue to decrease? It seems to me that with the number of URLs being generated by sites like Digg and Twitter, this will need serious attention in the future.
I wonder if it will get cut out completely, or if it may become important again as a way to push this new influx down in the SERPs.
I would love to get some of the seasoned experts to comment, I have not been involved long enough to try and venture a guess.
Again, awesome post, Rand.
I don't think on-page keyword usage will ever go down to nothing; it is still a good indicator of what a page is about, especially for websites that don't utilize title tags and header tags.
I also think links will always be important.
The more competitive the niche is then the less important on page factors like keyword usage become.
i.e., if you have a site about "polka dotted rhino mating rituals", then all you have to do to rank is simply include that phrase on your page once or twice, and since no one else (I would think) has that phrase anywhere on the web, you will rank for "polka dotted rhino mating rituals". However, if you try to rank for a competitive term like "debt consolidation", there are likely a million and one sites with great on-page factors and the phrase "debt consolidation" all over their page copy - this is an example of when ranking boils down to almost all off-page factors and domain trust.
All that to say, on page keyword usage will always be important but likely just as a very foundational element to ranking. Necessary but not going to do the job by itself in competitive niches.
Couldn't agree more - I feel that the "trust" of a website will be the highest factor in no time, and links from those sites will be more important than a horde of lesser, well-targeted (anchor-text-wise) links.
But that's not to say one should stop focusing on anchor text, or even on-page SEO factors - they may give you only a little boost, but a little can help a lot :D
Great post, Rand - awesome stuff and an awesome graph.
Loved this - got everyone in our office talking and discussing.
Spot on, absolutely & again love that chart.
I'm a bit concerned about biases toward brands, but I'm reserving judgement until more evidence of the "Vince" effects is known.
I am more than happy that authority sites are favoured, however, I would say that this has been the case for many years.
I have worked for two UK sites that both topped the rankings (following substantial content and link building) for multiple SEO related keywords.
Once we had achieved and maintained these positions for approximately 2-3 months, our much longer-tail and even unoptimised phrases began to rank very highly too, without any specific effort on our part.
To top this, every time we added another client to our portfolio page, within a month we would be outranking the client's actual site!
I really hope that brands do not get preferential treatment; the beauty of SEO is the level(ish) playing field we work in - any site can achieve great things if it adheres to guidelines.
Not only that but the guidelines generally mean that the best sites are at the top of the rankings.
Branded sites will always get traffic from their brand search - a hell of a lot more than organically. SMEs, on the other hand, do not have that luxury and need to achieve rankings for generic/specific keyword phrases.
Let's face it, Nike doesn't need to be ranked for trainers - if you want a pair of Nikes, what are you going to type into a search engine??
The graph explains everything. I previously thought on-page keywords had more value, but the graph shows otherwise. What I'm wondering is how we can increase domain authority. Can anyone help me with this?
Great Post Rand.
How can you keep your "domain trust" if you are doing a 301 redirect to a newer site/domain? Or does the 301 just carry the link juice and nothing else?
This is a great graphic to think about. I personally feel Google is doing everything it can to provide true search results to the end user through its algorithms, while balancing that with a successful business model. Google is constantly working against the spam side of SEO. I like Aaron Wall's response that anchor text has been on a slide since 2005, which you have to believe to be true if he said it.
This graph would, to me, support the numerous blog entries I've read of late that would suggest Google are starting to rank brands higher in the SERPS, for example:
https://www.seobook.com/google-branding
To me, "brand" is another way of saying "authority", though I do hope I am wrong, as I feel this would lead, in some cases, to brands being rewarded for little SEO work, whilst smaller sites wouldn't rank as well no matter how good their SEO strategy is.
Quite a flawed argument there, for me. Why should a site with "a good SEO strategy" rank above another website that the general population knows, through common knowledge, is the leader in its particular market?
Google needs to return the most relevant results for the human user, not the one which happens to fit their algorithm slightly better. A site with poor SEO does not necessarily make it a poor website and a poor company and vice-versa. Strong brands are strong brands because they are respected and/or successful in their marketplace, and Google would be foolish not to recognise this.
Just means more work for the SEO guy... which can mean more revenue! As an agency SEO, I see opportunity, not crisis.
Thanks for the article! I think as well that the major experience is the big growth of trust as a ranking factor over the years.
One thing I'd like to read your thoughts about is the composition of the ranking factors. Name them JOTA:
J - Juice
O - Onpage
T - Trust
A - Anchors
One could assume the factors are assembled as (J + O + T + A), but I have the impression it might be different. It is rather something like T x (J + O + A). The reason I think this would be important to know is that the second-term factors (J + O + A) would each still be important, as their value would scale with the site's trust...
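The difference between the two compositions proposed above is easy to see with toy numbers (all values invented purely for illustration - this is the commenter's speculation, not Google's actual math):

```python
# Two hypothetical ways of combining the JOTA factors described above:
# a purely additive model versus a trust-scaled (multiplicative) one.
# All scores below are made up to illustrate the structural difference.

def additive(j, o, t, a):
    """Factors simply sum: (J + O + T + A)."""
    return j + o + t + a

def trust_scaled(j, o, t, a):
    """Trust multiplies the other factors: T * (J + O + A)."""
    return t * (j + o + a)

# Two sites with identical link/on-page/anchor scores but different trust:
low_trust  = dict(j=0.6, o=0.7, t=0.2, a=0.5)
high_trust = dict(j=0.6, o=0.7, t=0.9, a=0.5)

# Additively, the trust gap shifts the score by a fixed amount;
# multiplicatively, the same gap scales the entire score.
print(round(additive(**low_trust), 2), round(additive(**high_trust), 2))
print(round(trust_scaled(**low_trust), 2), round(trust_scaled(**high_trust), 2))
```

Under the multiplicative model, the low-trust site's other factors are worth a fraction of what the high-trust site's are - which matches the commenter's intuition that juice, on-page and anchors still matter, but only in proportion to trust.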
This is a very interesting graph. Sometimes I wonder whether "content" has any say at all in where a site ranks!! Looking into the future, though, search terms are getting more and more natural; people search using natural phrases - instead of searching "best PC", searches are becoming "what are the best PCs on the market?". My example may not be that good, but you get my drift, right? Something to think about!
You are right! That is the same experience I have had with SEO.
This graph is on the one-hand helpful as it steers people toward quality content and a good user experience. So for an intro-to-SEO, this is a great graph.
As said several times above - but I'll say it in other words - it's not helpful for advanced SEO professionals. The term "trust/authority" is FAR too vague and likely confounded, because it will include some of the other factors on the graph.
And furthermore, trust is a human emotion/cognition. Are we saying that Google can "think"? I think not. As complicated as it is, it's still a set of equations based on mathematical principles (aren't we all? ;-)...
I've always thought of "Trust", from an SEO point of view, as a metaphor for how algorithms attempt to determine the most accurate, relevant information to present to humans.
Basically - using the technical signals it finds to determine how trustworthy and relevant a source is, getting as close as possible to how a human would think.
I do agree, though, that this is a vague way of explaining something for web development - as I've seen it, the only real direction that saying "trust/authority" gives to a webmaster is... build a legitimate website filled with accurate and useful information. Good advice, but pretty much common sense, IMO.
SEO and the Web are both better with this principle in mind.
....or maybe it's Google's big psychological conspiracy to control the world's information. :)
Nice graph, Rand. I tend to agree with Peter and the others on the content - relevance side. After reading through their new patent, it absolutely looks like this is the direction they're headed.
I've seen some competitors getting around this through the use of multiple domains with geographic modifiers, like www.domain.com, domain-detroit.com, domain-atlanta.com, etc., cross-pointing to each other.
For a good example of what I'm talking about, look at this Google search: crime scene cleanup, houston
You'll find there's a site ranking around #2 that's hosted for free - they just copied and pasted a Wikipedia entry and pointed a dozen or so links at it from related domains.
I think this puts more pressure on Google to really check the link relationships between sites, or else there's going to be a LOT of spam sites popping up.
I like the concept - domain trust helps with the rankings of a particular website - but I don't totally agree with the decline of traditional keyword optimization. That's just me, though.
What I'm really interested to see here is how Google intends to use the Analytics data it currently collects and turn it into another metric used to rank websites. This could be detrimental to tons of domains out there.
For example, blogs don't typically receive tons of page views, and the bounce rate is traditionally high because the visitors who keep up with blogs visit the most recent post (one page) and leave thereafter (high bounce rate). So, if page views and bounce rates are measured in that sense... *OUCH*
I also would like to know more about how "Trust" is established without linking. Thoughts?
Very interesting post. Thank you.
However, I have to say that authority and website trust are the key that will help you rank faster.
Meanwhile, backlinks - including the rankings of the linking websites and their anchor text - are the true path by which Google will increase your SERPs for those specific keywords.
In my opinion, #1 and #2 should be switched, but there is no doubt that both of these are the major elements here.
I couldn't agree more. I do believe anchor text is still important, yet it is the sum of those factors that makes up Google's algorithm - something every webmaster is looking to understand better!
Authority/trusted sites are quite powerful, I agree. I was very happy to see the graph data, as I'm starting to get deeper into SEO data mining and data visualisation and have been trying to test different variables in a semi-controlled experimental setup. On another note... from what I can see, a highly skilled link builder can clock up a tremendous link portfolio of blog comments, forum links and other low-level links and outrank sites that have fewer but higher-quality links. I think quantity still plays a certain role; perhaps it's the resource diversity that gives it the kick. Before I make any final conclusions, some more testing will need to be done. Curious to see if anyone has done any testing in this area?
This graph says it all.
However, for local sites, I notice that on-page keyword usage is really important, since local sites tend to have lower traffic, fewer inbound links and lower trust.
Anchor text matters a lot too, and for local sites it's usually pretty easy to get others to link to you using specific anchor text.
Great post! I appreciate the graph - it gives great insight into where SEO is heading.
Thanks for your point of view. It's interesting for newbies to find out how Google has changed the way it views websites over these years.
I think that in the future, Google and the other main players will look at how often data is updated, so sites that update their content and add new material will be ranked better. What do you think?
Rand,
In short, content is finally becoming what is really important. All of these factors are really starting to rest on relevant content either on-page or via links. It has been a long time coming.
This is really very helpful, thank you. In this regard I'd like to ask one question I've been trying to find an answer to. Suppose, I have links to my site coming from a high authority domain but from pages that have a very low Page Authority. How would you estimate the effectiveness of those links? Should they be very effective or not at all or somewhere in between? Would appreciate hearing your opinion. Leonid
The real question you need to be asking is, does it need replacing?
To build trust with the search engines, I believe that we must first build trust with our visitors.
Thanks for beating me to the punch!
Thank you for the visual - I find graphs far more useful for clearly illustrating trends over time. While this post and the prior comments are focused upon the tree, I would like to ask that some step back and view the forest or even the entire World.
These trends and a comment with link above about branding indicate that Google may be moving towards favoring corporate interests over everyone else. Given that they have a virtual monopoly on both search and paid traffic - roughly 60-70% for most sites whose analytics I can access - that is an extremely serious issue for small businesses and bloggers everywhere.
The best solution is to do whatever we can to diversify WHERE WE SEARCH. Blog about other ways of locating information online. Back independent solutions. Recommend other methods on Social Networking sites.
Yes, I AM well aware that there is no clear alternative available today. Even though I prefer Zuula, often the only results I can pull with it come from Google. If we do not do something NOW, there may be next to nothing we can do about this later.
You may want to start recording your favorite resources somewhere, just in case you cannot find them later. I searched for two days seeking posts and forum entries about dropping ecommerce sales and found only information from a few years ago. I can usually find anything if I research thoroughly enough.
The best solution may be to start asking each other directly or at places like Twitter instead of using Search Engines. Shop locally and patronize small online businesses. VOTE with your time and money and stop supporting the multi-national corporations that created the economic mess we are seeing today.
Every search engine that favors big corporations over searchers' need for quality information will lose its searchers, and Google is pretty aware of it. They achieved today's market share because they understood this better than others, and they've beaten every search engine of the past (AltaVista, HotBot, Excite, Yahoo, MSN, WebCrawler, etc.).
There are many assumptions in your reply.
Great Article. Keep posting.
This article is amazing. I really think that SEO will become more and more social in the future, so that trust rank will be the key to success in the SEO area.
Wow a great post again...Thanks...
But I think that hosting also matters for Google ranking. This is my personal experience with hosting.
Always host with a branded company... This will also help you get higher PR and higher rankings, along with good on-page optimization and proper use of anchor text.
Excellent article. The times have really changed. Search engines are now getting more social, much like MySpace or Facebook. https://www.nova-leap.com designed my website a while ago and they did an excellent job, but I plan to have them do more SEO. Thanks for the article.
Hey, anyone want to buy a good domain that would be perfect for an LLC or a company? I just bought it a minute ago: DigitalDistrict.com. I'm poor, so take full advantage of that and give me an offer!
Excellent - I would love to hear the thoughts of a search engine algorithm expert from Google (Matt Cutts).