(This post is aimed at business owners and new Moz users studying search engine optimization, not full-time, experienced SEOs. This is an in-depth answer to a frequently asked question in the Q&A forums. If you are a full-time SEO, this is one post you can easily skip.)
Regular readers of the SEOMoz Q&A forum will likely notice that the “general” SEO questions come up again and again as new users struggle with the same questions we have all asked at some point.
- “What am I doing wrong on this page?”
- “How can I improve my SEO?”
- “What/where can I do link building?”
- “Subdomains or Subfolders?”
And the one we will address here:
“Why is example.com outranking me for X and Y keyword, even though...?”
This question has been posted many times in various forms lately on Q&A. Here are a few:
- https://moz.com/q/competitor-out-ranks-my-site
- https://moz.com/q/how-does-this-site-rank-higher-than-a-seemingly-better-one
- https://moz.com/q/how-can-this-website-be-ranking-so-well
- https://moz.com/q/why-would-a-site-with-virtually-no-external-links-and-low-authority-rank-very-high
- https://moz.com/q/why-is-this-site-ranking-better-than-me
Non-SEOs, mostly small business owners, come to SEOMoz to learn a bit about SEO and hopefully improve their site. You learn about Domain Authority and Page Authority and work hard to improve your MozTrust and MozRank. Those metrics are readily apparent and easy to track with the SEOMoz trial.
So you sign up for the trial, run your brand new SERP overlay tool and see this:
The Keyword Difficulty score for “credit card debt” here in Australia is 53, which is rated “highly competitive.” Is debtrescue’s very high ranking an aberration? Maybe this keyword is an anomaly and other sites do not “make sense” based on metrics, either. But that is not the case. In fact, this site (debtrescue is 2nd on my search, 4th in the Keyword Tool’s search) has the lowest Domain Authority by over 15 points and the second-lowest Page Authority by 22 points.
This is not really close – by the stats, debtrescue.com.au has no business ranking on page 1. Just for information, the 9th site (95 DA, 1 PA) is a .gov site so we know why this particular “PA 1” ranks this well.
So you own a business and you are new to SEO. You have looked at the SERP overlay and it makes no sense. You look at the DA/PA chart on keyword difficulty and it is fairly clear that this site does not rank based on metrics. Let’s pull just one more set of data to make the point.
Our target site has nearly the lowest MozRank, by far the fewest links of the non-.govs, the lowest subdomain MozRank by a lot, the lowest Page Authority by a wide margin, and the lowest Domain Authority by a mile. The page 2 results for the same search show a trusted bank site – NAB.com.au – with a DA of 82, a PA of 44, and 300,000 links.
So our question is clear now – how can a site like debtrescue outrank anything else on page one or NAB on page 2? The metrics all say these other sites should be winning handily.
So what is the magic answer?
That elusive magic answer …
I could give you twenty, maybe fifty possible answers. Unfortunately, I could give you twenty and the right one for your situation may be #21. So let’s ask the question again and then reword it so you can see the problem clearly.
“Why is example.com outranking me for X and Y keyword even though _________?”
The real question is this:
“Aren’t these Moz metrics what count?”
That leads us much closer to the answer: these metrics count, but they don’t come close to telling the whole story.
According to the 2011 Search Engine Ranking Factors survey on SEOMoz, many SEOs think page-level and domain-level link metrics account for roughly 43% of the algorithm. That means over half of your total ranking is determined by factors other than page and domain link metrics.
Are you starting to see the bigger picture? How sites rank on Google depends on many factors other than the few you are introduced to when you start learning about SEO. PageRank, number of inbound links, and the main Moz stats tell a story about your site. Unfortunately, it is like part one of a two-part season finale of your favorite show: you didn’t know there was another part coming next week, and now this?
So what is the rest of the story? What else matters?
If you ask 100 SEOs for the top 50 ranking factors, you will get back 100 completely unique lists. The fact is, we don’t know the whole story. Google can’t come out and tell us the whole algorithm because people would take advantage of that information to rank poor quality sites simply by manipulating the ranking factors and focusing on nothing else.
Here is what we know: Google tracks certain data. You should believe that Google tracks data it finds relevant and reliable. So what can you find in Analytics and with a bit of testing?
Visit duration matters. This metric tells Google that someone found your page and enjoyed it enough to stay on your site longer than they stayed on your competitor’s. Duration is a sign of the page’s overall quality and usefulness.
Pages per visit matter. If a user searches for “credit card debt” and visits just one page on the #1 site, then seconds later bounces back and goes to #2, where they spend 15 minutes and visit 3 pages, any guess which site Google thinks matters more to that search query?
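To make those two engagement signals concrete, here is a minimal sketch of how you might compute average visit duration and pages per visit from your own exported session data. The file name and columns (`session_id`, `seconds_on_page`) are hypothetical, for illustration only – this is not an Analytics API call, and there is no claim that Google computes the numbers this way.

```python
import csv
from collections import defaultdict

# Hypothetical session export: one row per pageview with assumed columns
# session_id, page_path, seconds_on_page
sessions = defaultdict(lambda: {"pages": 0, "seconds": 0})

with open("sessions.csv", newline="") as f:
    for row in csv.DictReader(f):
        s = sessions[row["session_id"]]
        s["pages"] += 1
        s["seconds"] += int(row["seconds_on_page"])

if not sessions:
    raise SystemExit("No sessions found in sessions.csv")

visits = len(sessions)
avg_pages = sum(s["pages"] for s in sessions.values()) / visits
avg_duration = sum(s["seconds"] for s in sessions.values()) / visits

print(f"Visits: {visits}")
print(f"Average pages per visit: {avg_pages:.1f}")
print(f"Average visit duration: {avg_duration:.0f} seconds")
```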
On page factors matter. The days of putting your keyword keyword keyword on the page three keyword times are keyword thankfully over. This will rarely help you and can actually penalize your site. However, having accurate and descriptive titles that match the content of the page and the user intent of a search? That does matter!
Algorithmic and manual penalties matter. If you put that keyword fifty times on your page, as we discussed above, you will run into an algorithmic penalty. This means Google knows you have “stuffed” the page and will take a bit off your ranking. Do it repeatedly and you’ll get a much larger penalty. If a Google manual reviewer catches you doing this, or a competitor reports you for it, you may get a manual penalty. You will get a notification in your Webmaster Tools about this one, and you do need to fix it or suffer a harsh penalty.
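If you want a rough sanity check on your own pages before worrying about penalties, a toy script like the one below can flag obvious over-use of a single keyword. The 5% threshold is an arbitrary illustration, not a number Google publishes; write for humans first and treat this only as a smoke test.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words on the page, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Plain-text copy of your page (hypothetical file name).
page_text = open("page.txt").read()
density = keyword_density(page_text, "keyword")
print(f"Keyword density: {density:.1f}%")

# The 5% cutoff is an arbitrary illustration, not a published Google rule.
if density > 5.0:
    print("Warning: this reads like keyword stuffing - rewrite for humans first.")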
Social Share Metrics Matter. Some SEOs believe social metrics matter very little. We have tested these ourselves and found that social metrics matter “some.” If you get three retweets on your link, no, it won’t help you very much. You won’t notice a bump. Get 200 retweets on that same link, have it shared 50 times on Facebook and now it is pinned 8 times? Yes, that will affect your ranking.
Anchor text matters. Don’t misunderstand what I am saying here. Anchor text can be positive *and* negative. If you over-optimize a handful of keywords, it will have a negative effect. If you balance these and include branded anchor text across a variety of link types, it will have a positive effect.
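As a quick illustration of what a “balanced” profile looks like, here is a minimal sketch that tallies the anchor texts pointing at a site. The anchor list is made-up sample data; the point is simply that one exact-match phrase dominating the profile is the pattern to avoid, while branded and URL anchors should make up a healthy share.

```python
from collections import Counter

# A hand-collected sample of backlink anchor texts (hypothetical data).
anchors = [
    "Acme Widgets", "acmewidgets.com", "cheap widgets", "click here",
    "Acme Widgets", "https://acmewidgets.com", "cheap widgets", "cheap widgets",
]

counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())

print("Anchor text profile:")
for anchor, n in counts.most_common():
    print(f"  {anchor:30s} {100 * n / total:5.1f}%")
```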
301 redirects matter. Your competitor with the poor metrics may actually have fantastic metrics – on another site. They may have changed domains and redirected all that great SEO juice over to the new site, which looks like it has no value but is absorbing a lot of the value from the previous domain.
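If you suspect a competitor has done exactly this, it is easy to check whether an old domain permanently redirects to the new one. A minimal sketch using the Python `requests` library – the domain name is a hypothetical placeholder.

```python
import requests

def check_redirect(domain: str) -> None:
    """Show where a domain's old URL actually points, without following it."""
    resp = requests.get(f"http://{domain}/", allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(f"{domain} permanently redirects to {resp.headers.get('Location')}")
    elif resp.status_code in (302, 307):
        print(f"{domain} temporarily redirects to {resp.headers.get('Location')}")
    else:
        print(f"{domain} answered with HTTP {resp.status_code} (no redirect)")

# Hypothetical old domain a competitor might have folded into their current site.
check_redirect("old-competitor-domain.example")
```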
We could keep playing this game for a while but we won’t. You should understand that the metrics you start with are a small part (less than half) of the overall picture. We also know of certain “exceptions” to all these rules, such as the ranking bonus new sites get when they first appear on the SERPs. You can’t control how Google sees other people’s sites so try to stay focused on what you control.
For even more information, be sure you go back to Eppie Vojt’s phenomenal “How Garbage Ranks in the SERPs: A Case Study.”
Eppie posited that manual penalties may occur because “Google just can't allow low quality sites to outrank billion dollar brands for high visibility terms” and I tend to agree with him. Sites that do not “deserve” their ranking often fall off quickly or get manual penalties. Ask yourself if this site was there a month ago – and then check whether it still ranks in another month.
When you are done with Eppie’s excellent article and you want to learn even more about how truly bad sites rank, continue your education with Wil Reynolds' “How Google Makes Liars Out of the Good Guys in SEO.”
I hope this explanation helps some of you who are just learning and starting your SEO journey. I also hope the more experienced SEOs jump in and share a few of their own ideas on ranking factors and why some sites rank well despite their average or even poor Moz metrics.
I'd like to add one more reason: Google uses their own algorithm, and we haven't been able to convince them to use Moz Metrics for ranking or to give us access to an API for their algorithm. :)
Exactly! I think after people sign up for a trial of SEOMoz, they sometimes see these few metrics and think that's it - if they can improve those, they win. And that's *somewhat* true... but not the whole story by far. :)
Thanks!
I needed exactly this info. I came here by searching for it on Google. Thank you very much for that.
Keri, you are BRILLIANT! That's the response I'll be giving from now on - "I wasn't able to call and convince Google to move you up, sorry :( "
Good post, Matt! Kudos!
Thanks Max! Appreciate the kind words - watching my first post is nerve wracking. :)
Thanks for your blog post. It's awesome, Matt.
Nice answer Keri.
I would also add that the first two results in the SERP overlay above have keyword-rich URLs and the third does not, so that could be the reason a page with lower PA outranks a page with higher PA.
Thanks for the nice post, Matt!
Darn Google and no API access to their algorithm!
Nice one, Keri. As Matt mentioned, it's not easy to come to a conclusion on exactly how a site ranks above yours. Google has more than 200 search engine ranking factors, which are fair, and the algorithm does a smart job of separating the wheat from the chaff. Also, Google shouldn't and won't make each and every ranking signal public, as most people would make the best use of the wrong side of it. Having the targeted keyword at the beginning of a page title gives a ranking boost, while over-stuffing keywords in the title tag makes it look like spam and drops the rankings - these kinds of experiments have been done by webmasters to learn what works and what doesn't. SEO is all about experimentation.
When Google can treat the keywords 'Dr. Surname' and 'Surname, MD' as the same person, I'll have more faith in humanity. Some SEOs have used 'Dr. Surname, MD', but that is redundant: you've already said they are a doctor, once informally and once formally.
Wait, what? SEOMoz metrics aren't all that matters? Are you telling me my $99 doesn't buy me guaranteed rankings? What a rip...........J/K!
Great Post, I'm sure many new Mozers will find this beneficial!
Haha! Sorry, check with the clerks on your way out. ;-)
Thanks for the kind (and funny) comment! Appreciate it.
Great post - people need to stop thinking about rankings as a video game, always trying to get more of X and Y to be more powerful. Google wants to see you as a brand with quality, relevant material and value to add - content that people like, share, and link to in places that have traffic. People need to chase that, not the algorithm.
Exactly - I couldn't agree more! I think it's harder to "make good content" than "add more meta keywords" so people are definitely still figuring out this SEO Video Game and they aren't necessarily loving what has replaced traditional quests.
Awesome post, Matt.
One thing I want to point out is link relevancy. This is something Moz Metrics can't pick up and is HUGE right now.
One reason I've been giving my competition an old-fashioned beat down in the last year is that I've been ignoring most metrics...and focusing on the relevancy of the sites linking to me.
If you look at the sites that hit the top 3 (and stick around), you'll notice they have a very small number of authoritative, relevant links.
So true! And as I said, "we could keep playing this game" because link relevancy, co-citation, page speed - all not mentioned, but I would say all important. Authorship, not mentioned. I would say it's very easy for new business owners, and especially new Moz users, to get enamoured of a single metric. PageRank is always one I hear. "But my PageRank..."
Can you talk about why and how you believe visit duration and pages per visit may be a factor?
These metrics are a bit ambiguous because visit duration and pages per visit, without context, are not obviously a good or bad thing either way (except if they bounce immediately, of course.)
For example, if someone lands on my landing page and completes my form, that may be a visit duration of 1 minute and 1 page view. If someone lands on my product page from a long tail product search, adds an item to cart and checks out, again, that may be a very short visit with low page view count.
Thanks.
I think there are two partial answers to your question and maybe nothing satisfying since it's definitely a vague topic.
1) My guess is always that Google does things a lot closer to perfect than we think. Stumbleupon traffic hits your site, almost always immediately bounces. It's crap traffic. If that's all you ever get, it harms those metrics, for sure. I don't know what is going on at Google but for me, I would want to know the page size, the percent seen, and where the search keyword was - did they get to see it?
For instance, say your contact page gets hit and immediately they send the form, per your example. I think Google can figure out that they searched for a page that included a form, they used a contact form (usually some type of redirect, coding change, etc.), and then they see that the visitor leaves but doesn't go back and search again. Same thing with the product searches. If someone searches "cheap cell phone," comes to your page, and buys the first thing they see - they're *very* unlikely to go back to Google, do the same search, and buy a phone from #2 as well. So the visitor behavior should definitely change.
2) I think the other part of the answer is in your question. You mentioned "without context" and I agree. I just think Google sees more context than we would assume from a simple search. Searchers don't usually just "give up" so if you are the last site someone hits, and you hit a page called "cart" and completed an Analytics Goal, I'm going to assume that Google can tell that's not a bad thing.
Again, these are assumptions based on my own testing and client data but it seems to matter. We had a client do a big promotion on Groupon and they had a landing page for the people who had bought the Groupon. It linked to a couple internal pages - both of those pages went up DRAMATICALLY in Google organic SERPs for about 6 weeks. It couldn't be a single internal link, so we figure it was a combination of where the referral came from, how long they stayed, how many pages they saw and/or other "metrics" that we just don't understand fully.
Final thought on this - I don't know that visit duration counts. But Google preaches to us over & over, quality and user experience. I believe if you have a 1400 word post and people spend an average of 11 seconds on it, Google *knows* they aren't reading it all and therefore your quality isn't up to what the searcher expected.
We know there are possibly hundreds of factors - I guess my overall thought is "other things" do matter.
I think page views and visit duration are a few of the important points. I have seen a video of Matt Cutts saying that Google data centers collect data sent by browsers, servers, etc. It seems certain that browsers with the toolbar installed send details like visit duration, number of page views, and so on.
Hi Matt,
Thanks for the post. More often than not, we are asked this question, and at times the honest answer is "I don't know"... and even when you can understand why a site is outranking us, it's even harder to make our clients understand the same. Anyway, that's the added "task" if you are into online marketing. My question is as follows:
I can understand that Google can track how many visitors bounce back to the SERP from a website, the CTR, the "dwell time" - the sorts of things Google can use to understand user behavior. But data like number of pages visited or pages/visit can come only from Google Analytics. So are you saying that Google uses Google Analytics data?
Salik Khan
Hi Salik!
Yes, you're right - it's often "I don't know." Or even as I said "Unfortunately, I could give you twenty and the right one for your situation may be #21."
To answer your main question, I think Google uses everything they can get their hands on. For Chrome users, that's different than IE users. For Analytics sites, that's different than those who use Statcounter. I would say if Google can get to it, they use it - or POSSIBLY use it - and that's the biggest thing we need to keep in mind. We don't know the algorithm - so when someone asks "why are they outranking me" the real answer is "something the other guys did is better than something you did." :)
There has to be a benchmark in order to rate websites. If I am not using GA on my website, it's not a fair call to judge other websites using their GA data. I believe Chrome has a bigger role to play here; with its growing popularity, I guess it is no longer just a browser but a data/user-behavior collector.
Thanks for your reply...
Thanks for the awesome write-up
Jeff
I wholeheartedly agree.
As an SEO trainer, I always make sure I teach my clients to first create useful, original, loveable content, and then optimize it!
I work with them on the useful and original part just as much as I do on the optimization part.
I would add that how often you update your website is also something important that we don't see in the metrics. Sure, quality beats quantity but a good frequency keeps you up in SERPs.
Yes, yes and yes! Frequent updating helps - especially if you have a page called /blog. People don't really realize that there are certain page types Google must be aware of. /contact, /contact-us, /blog, /news, /portfolio, /services ... you really do need to do what these things say. If you have a blog, update it. If it's not going to be updated, remove it. Either, or. :)
This is one of the best articles on SEO I've seen in a long time because it addresses the questions no one wants to ask. With all this knowledge and experience, why do some sites rank for no apparent reason? I'm afraid it also raises the question of whether Moz's DA and PA are accurate - because when those metrics start to get it right more consistently, they'll become more useful for SEO.
I think DA & PA are accurate reflections of the overall "power" of that page/domain. What we're not taking into consideration on that glance is relevancy. If you have a 90 DA, 30 PA page all about trains and I have a 60 DA, 25 PA page all about trains but your main domain is all about cars and mine's about trains, wouldn't my site be the "expert" site and thus rank higher?
And I know DA doesn't work this way, but it's a useful way to think about it. If you have 70 DA and you have 69 posts about cars and 1 post about trains, it's useful to consider that you really have a 1 "train" DA, and even though you'd beat me for "auto show in Kansas," you probably won't beat me for "Sheldon loves trains."
Thanks, this is a great post. Would you say that patience matters too? I feel like I have been doing everything right for the past six weeks, but I have made no progress in terms of moving up the rankings - it's really annoying me!
6 weeks isn't enough time to rank most competitive keywords. What are you trying to rank? I ranked a wedding planner in like 6 days. But most of my clients see appreciable differences in a month, two months, three months ... and then we can see how we're going. Patience is definitely a virtue or you're going to want to "do anything" to get to the top - which is a short term strategy.
I think the crux of all this is that user experience is what really counts. Creating pages with users in mind rather than search engines means that many of our metrics (average time on page, bounce rate, click-through rate, shares, links back, etc.) are naturally good, and therefore you will rank better. Of course, following the Webmaster Guidelines underpins all this, but people really need to create a great user experience (including embracing mobile and responsive design) if they want to rank highly.
You said it, exactly. "Many of our metrics are naturally good and therefore you rank better." THIS is exactly what we need to do. If you give visitors exactly what they're looking for every time in the way they want it, they won't leave. If they don't leave, your metrics all improve. If your metrics ALL improve, your SEO improves naturally. You may not beat competitorY tomorrow for keyword1 or keyword2 but in 6 months, a year, 3 years, 5 years, you'll have passed them and stayed well beyond them because you do it the right way.
Hey Matt!
Thanks a lot for answering my question the other day related to this topic. I can see this is the post you were talking about and it's awesome. I did think it was all about the moz metrics, can't deny I also got addicted to trying to improve those numbers. I have learned a lot in just a month thanks to all the people here posting in SEOmoz, I'm confident to say my basic knowledge on SEO is solid now.
I'm bookmarking this post for sure to check out all the links on it one by one.
Hey Eblan! I'm glad you found this! I was actually going to go back to your post and add this link or private message you with it but you saw it first. I agree that when people first learn about Moz and those Moz metrics, it's easy to get obsessed and work on those.
Glad the post (and Q&A answer) have been helpful & thanks for commenting!
Great Post Matt - Your new post reminded me to come back and read this 1 more as well :D
Thanks Mike! Glad you enjoyed the posts and hope they help you going forward. :) Appreciate both comments.
Great article, Matt, and I always love huge amounts of infographics :3
I think what could be added to this is a more in-depth talk about the products Google actually provides (e.g. Analytics), though the Moz metrics are great! SEOmoz isn't Google... Having multiple tools (and multiple sources of information) is always good - the more info we have, the more of an extra edge we can get.
In terms of authority, it's very dependent. I have a client with a 10-authority site who ranks better for specific keywords than their competitor with a 34 authority. The only difference between the two sites is that the competitor has the added domain age & more pages (not more content, but PAGES!). Eventually we'll catch up, but they have a few years' advantage on us. I'd love to buy a fresh domain and see if I could beat one with medium authority and a few years' age on it - probably not with the current state of Google re-writing the book, but it'd be fun!
Thanks! I like using screenshots & graphics! I obviously (from the post and comments) like to discuss things in depth and talk a lot so whenever I can say things in FEWER words, I'm all for it. :)
Love the idea of taking a new site/domain and ranking it simply. I may start a blog next week and outrank something just for fun. lol
Hey Matt. Thanks very much for the thoughtful post. I think you really highlighted some important concepts of SEO here.
We use the phrase "correlation doesn't equal causation" a lot in SEO (it's overused, in fact), but it's very true. If someone sets their goal on getting their Domain Authority up (only), then it's no surprise if it doesn't necessarily lead to a top position. Our jobs are difficult because we need to be thinking about a variety of factors and not focusing on one thing. If we were to "think outside the box," we could start to look at page speed and other technical aspects (such as 301 redirects), like you said.
Thanks for the comment Nick! I agree that we need to focus on a variety of factors and more importantly, not let clients get hooked up on one factor. I would say that used to be "let's improve our Pagerank" and now I think it's "let's get more links" or here on Moz "let's improve DA/PA and we'll be fine!" And it just isn't.
BTW - good call on "page speed." I usually mention that one every time I talk about this stuff - amazing I missed it on my short list but then again, even my list is 8 or so out of 100, 200 or who knows, 500 factors.
Google's algorithms change every day. It is a really hard and time-consuming task for web developers.
Nice post for newbies. There are so many factors involved in Google's algorithms beyond good content. From my experience, most small businesses assume that Google is "smart enough" to find their website and properly categorize and rank their site. They have no idea what it takes to outrank their competitors, so it takes a lot of educating small companies to sell white-hat SEO.
Great write up! Google looks at so many different metrics and there is no 'silver bullet' that will make you number 1.
Thank you!
Great article Matt. I have just finished my monthly analysis and to my horror found quite a few sites have maintained core business offerings as top rankings, but bombed out of the SERPs on other terms that they have ranked #3 - #10 for over the past 6 months plus.
In analytics many of these terms are not getting the CTR that the core business terms do and I can only deduce that Google updated this particular algo in December as it seems to run through a few sites against the phrases that have dropped furthest.
This concurs with your time on page and length of visit, but I am left thinking it is the CTR from the SERPs that has more weight - at least with the other metrics users did click into the site, rather than leave the data with just an impression.
Working on optimised descriptions and snippets for the SERPs to sell the site to users has been prioritised.
GREAT ranking factor. I think CTR is another one of those hidden things people forget. It's nice to optimize your meta description to be relevant to keyword searches (I don't necessarily think it's a "ranking factor"), but if someone searches "melbourne photographer" and your description says "Melbourne photographer: hire a photographer in Melbourne, Melbourne VIC AU wedding photographer," nobody will click that crap. Then you get to the next one and it says "Julie has been a wedding photographer in Melbourne for 8 years. View galleries, get pricing." Which are you going to click?
When a site draws that click, it has to help. Google has to think "someone saw that, thought it was relevant. Let's serve that to more people." I would say a #1 NOT getting clicks is a big indicator, as is a #10 getting more clicks than the #5 or 6. Good points!
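For anyone who wants to eyeball this in their own data, here is a toy sketch of the comparison being described. The impression and click numbers are made up for illustration; real data would come from your own search analytics export, and a #10 out-clicking a #5 would stand out just as quickly there.

```python
# Made-up SERP data: position, impressions, clicks.
serp_data = [
    {"position": 1, "impressions": 5000, "clicks": 900},
    {"position": 5, "impressions": 4800, "clicks": 620},
    {"position": 10, "impressions": 4500, "clicks": 700},
]

for row in serp_data:
    ctr = 100.0 * row["clicks"] / row["impressions"]
    print(f"Position {row['position']:2d}: CTR {ctr:.1f}%")

# Here the #10 result out-clicks the #5 result - exactly the kind of
# signal discussed above.
```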
First of all, congrats on your first post, Matt... I like your blog as most of the people here do. And I am again asking the same question here which I asked on other posts on SEOmoz.
Can we really rely on Google not to kick sites out of the top rankings again, even if you are doing a great job with linking, content, and everything else? Google's algorithms always change, and this time all the predictions and good work seem to be going to waste as most of the crappy, bad-content websites and keyword-rich domains are taking over from the masters.
So do you think it will not happen again if we keep in mind variation in linking, quality content, and everything else the experts recommend?
Share your thoughts on this.
Thanks!
Cyrus just had a beautiful Whiteboard Friday on the topic I think you're addressing - how to "future proof" your SEO.
https://www.seomoz.org/blog/high-value-tactics-futureproof-link-building-whiteboard-friday
I would start there. I think he's right - stuff to the right of center will "always work" and stuff to the left will work less & less. And the stuff in the middle depends how you use it. I think automated SEO is going away and even though there will always be those who can and will use it successfully, they have to get smarter and smarter while those of us ranking without "tricks" can just get better at doing the good stuff.
To all the other small business owners out there: stick to long-term strategies. Quality content along with white-hat SEO practices will stand the test of time. There is very little "low-hanging fruit" these days when it comes to SEO. Trust your instincts - don't fall for all the black-hat tactics out there or businesses claiming they will get you to X position in the SERPs.
I would love to see a follow-up article from this one, Matt, getting a little deeper/more advanced into some of your points. Keep up the great work!
Yeah. I myself see a lot of websites on the first page that "don't deserve to be there" by any SEO measure. This post explains some points, but there is still a lot of unknown stuff. I understand that the SEOmoz metrics are not everything, but they are still good orientation. The other things we will find out one day.
I guess the point is to try and find out which factors matter and which don't, and we can do that by running lots of tests and keeping records of the methods we have implemented. By the way, I do believe that visit duration actually matters even though some of the posters here doubt it - I've seen proof of it many times.
OK, but what about "visit duration matters"? How exactly does it work? For example, if I'm looking for a taxi phone number in city N, I'll type the query and will definitely find the result I need in 1-3 seconds. Is that good or bad?
I think I answered part of this in my reply to David here: https://www.seomoz.org/blog/going-beyond-moz-metrics-to-answer-why-is-this-site-outranking-me#jtc210853
But, just to clarify:
I think visit duration matters (on some types of searches). Informational searches are likely not affected as much. If it's a query you expect Google to answer with a Knowledge Graph, I don't think duration matters. They don't even want you to bother going to the sites. For instance, a search of "george w bush" turns up his dates in office, VP, date of birth, etc. right on Google. I don't think they care how long you visit the wiki page.
But if you do a search for say "marketing blog" and you bounce off the first one in 10 seconds, the second in 2 minutes and you visit 8 pages on the third, spending 24 minutes, I think that matters. Otherwise it would be silly of them to tell us to "create the best user experience" if they couldn't track that (I think Google uses whatever data they have - Analytics, search page data, etc.)
One should change the way one looks at the SEOmoz tools.
They are very helpful and provide analysis of many elements of your web page:
e.g. the level of competition for keywords, the average ranking position for a certain keyword, basic on-page things.
However, it is certain that Google is not using the SEOmoz tools to rank websites.
Google's level of expertise is far (far, far) beyond what the SEOmoz tools can do.
The SEOmoz tools are very helpful for debugging, tracking, and getting the basics in order. But they cannot answer why you rank #1 or #10.
;-) Not to upset SEOmoz - the tools are great.
But in the end they are just tools, and you need to do the SEO yourself.
Really appreciated the post!
Yes, that's the idea, I think. If we change how we see the Moz tools - from "this is why or why not" a site is ranking to "this is a piece of the puzzle" everyone will be happier, I think. Now let's convince clients. :)
Excellent Article, it's good that it's gotten so much attention as well!
Good job!
As a small business owner, I am guilty as charged. What is also frustrating for us is when we see a strong correlation like age of site or exact-match domains, and those correlations are dismissed out of hand by SEOs.
I think age of site definitely matters, but do you want another little secret? So does the length left on your registration. I did a pretty extensive test a while back with some domains I had never used. A couple had 6 months left, one had a year and a month, a couple had more than a year - like 18 months and 22 months. They were all registered around the same time but I extended a few out, thinking I'd use them.
Anyhoo - fast forward and we found that the one with 13 months ranked better and easier than the ones with 6 months on them until it hit the 11 month mark. Two months in, the year+ domain fell dramatically and joined the worse-performing soon-expiring domains, instead of the better performing ones with over a year left.
The results were actually fairly shocking so I let those 3 domains expire and then watched the long-registration ones. They ended up falling with about 10 months to go. Following the logic of our test, we figured that since most spam domains are registered for under a year, Google must trust domains with "under a year" left on registration differently than those with over a year. I would say older domains still do perform better at that point but there's a serious drop off if you have 8 or 10 months left on your registration, rather than say 4 years.
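For anyone who wants to repeat a rough version of this test, the bookkeeping is trivial - the sketch below just computes months remaining from expiration dates you would pull from your registrar or a WHOIS lookup. The domains and dates are made up to mirror the test described above, and the 12-month cutoff reflects my observation, not a documented Google rule.

```python
from datetime import date

def months_left(expiration: date, today: date) -> int:
    """Whole months remaining on a domain registration."""
    return (expiration.year - today.year) * 12 + (expiration.month - today.month)

# Made-up domains and expiration dates for illustration only.
today = date(2013, 1, 15)
domains = {
    "short-reg.example": date(2013, 7, 1),   # ~6 months left
    "mid-reg.example": date(2014, 2, 1),     # ~13 months left
    "long-reg.example": date(2014, 11, 1),   # ~22 months left
}

for name, expires in domains.items():
    m = months_left(expires, today)
    flag = "under a year left" if m < 12 else "over a year left"
    print(f"{name}: about {m} months remaining ({flag})")
```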
As a small biz owner I spend way too much of my time trying to "get this right" and end up with a few trials and plenty of errors along the way. Thanks for this insightful post, Matt.
Glad it helps a bit! That sounds like all of us - few trials and plenty of errors! :)
This is probably one of the most useful, helpful posts I have ever read. :) I love it here. I agree with Heather, patience is very important too. I've found myself frustrated with things in the past only to check a couple of months later and find pages ranking highly that I'd almost given up on.
Thanks for the great comment!
Thanks for the informative article. I know there was a time in my career where I saw this happening - a low Page Authority site outranking a high Page Authority site - and I couldn't figure out for the life of me what the magic bullet was.
My thought now is, everybody is saying we should disregard ranking results - they are too tied to one's ISP, browser history, and location. So how do we truly know who is outranking whom in any consistent way for any given search, to make a proper evaluation?
I don't disregard ranking results, but what I tell clients is that the actual "average position" data from Analytics is going to be closer to the truth. Average position is where someone saw you, clicked or not, on a real search. That means you can have some idea of ranking, but not as precise an "I'm #3" as it was 2 or 5 years ago. :)
Great, thank you Matt.
Well said! But understandably frustrating for the recipient of "I don't know" - especially one that's gone by the white-hat textbook. I guess Yoda would say, "An unfortunate profession, SEO is."
I definitely understand that it's frustrating when we can't figure it out. I'm actually glad for all these comments, especially the next couple below yours, as well, because it gives us the words and ways of saying why we don't know.
It is frustrating when you can't figure it out. It's circular.
"Why are they outranking us?""I don't know but here are a few reasons it may be."
"Well which is it?"
"Ultimately Google thinks that site is a better match.""But why?""I don't know but here are a few reasons it may be." (lather, rinse, repeat.)
Super article, Matt. Yes, there do seem to be a lot of factors impacting the page well beyond page authority/domain authority. I also wonder how the landscape may shift if/when they truly integrate google authorship.
I also wonder about the importance of topic authority (which I think may be what you were mentioning with domain-level keyword usage). My blog seems to do very well. It's about flooring (all types), but I tend to rank much better for articles about hardwood flooring (maybe due to my domain name, which has flooring in it) and less so for carpet (and I'm not sure how to rank better on carpet).
I've just taken a quick overview of your site and there are a few things I'd change:
1) URLs - your blog post URLs are WAY too long, and, for example, the free quote form is under "contact-form-2.html," which isn't optimized for friendly use. Same with "westchester-flooring-blog.html" - why not just change it to /blog.html? It'd cut down the huge length of your blog post URLs (which extend over Google's limit).
2) Image/Design - having your face plastered 2 times within my 1080p display isn't the best thing for a user or for Google. You're trying to sell a product (from what I understand) and associating yourself with the brand, which I totally get, but "over-association" is a thing too. The About Us picture is great, but I'd get rid of the image next to your logo.
3) Link building - HOLY S$%T, you have close to 10,000 backlinks on a site that honestly doesn't need them. If I were you I'd focus less on external link building and a LOT more on internal: having your keywords in a blog post linking back to your specific sales or "money pages" is a great way to add value to your site.
Now I realize I've just given you a bloody audit, but I hope this is helping you, and feel free to contact me if you need any other support/help. I need to get back to work now before I spend another day on SEOmoz and have a client phone me up screaming about no work done, yet again.
Hope it helps - Charlie
Thank you so much for your input.
1. I agree with your point about the home page/2 pictures... and the home page in general needs some work for optimization - visually, content-wise, and SEO-wise - and that is a project for March. This is due to the limitations of my theme, and I need time to figure out the new one. I completely agree. (Also, most of the entrances (90%) come in through the blog posts, where there is only 1 picture of me, and I like that better.)
2. URLs too long... actually most are within the Google guidelines/SEOmoz recommendations, but point well taken - some should be shorter, that is a good guideline, and there are still some I need to fix.
3. Link building. I'm surprised that I have 10,000 inbound links. I have actually not done that much link building for the site. However, I do have a blog on Active Rain, and I have a hunch that most are from there and from the agents reblogging my content. What I don't have is much diversity of linking domains.
Thank you for all your advice. Very helpful.
I think authorship will be an interesting one. As was mentioned above, if someone spends two minutes or more on your article, and then clicks back, they're taken to more articles by you. What if they were taken to this article by me on Moz and then after a couple minutes perusing these great comments were to go back to Google? Would they then see more of me on Moz? More of me on my own site? (All this reminds me that I forgot to put a bio on this post and no link to my G+ profile. I should get that fixed so I get authorship for this! lol)
On the second part of your comment - you actually rank #1 here for westchester carpet pad - that's what you blogged about recently.
Not that you asked but I'm obsessed with looking:
You have no meta description so Google will put whatever they feel like on search results. The Facebook oauth line keeps slowing your site loading time down considerably. You have 23 validation errors, 16 warnings. Of your 10,600 links on Ahrefs, 9500 are dofollow - seems high, especially as 7000 are "sitewide" links.
That's just a couple ideas but fits in well with my post. You can always find a few areas to improve - over & over. And as you do, you slowly do find those reasons but they're not always obvious.
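If you want to run the meta description check yourself, a few lines of Python with `requests` and BeautifulSoup will do it - the URL below is a placeholder, so point it at your own pages.

```python
import requests
from bs4 import BeautifulSoup

def check_meta_description(url: str) -> None:
    """Print whether a page has a meta description and how long it is."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag and tag.get("content", "").strip():
        content = tag["content"].strip()
        print(f"Meta description ({len(content)} chars): {content[:80]}...")
    else:
        print("No meta description - Google will pick its own snippet.")

# Placeholder URL; run this against your own pages.
check_meta_description("https://www.example.com/")
```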
Glad my comment was able to prompt something to help you. I do think authorship will have an impact, but we shall see. From what I've read, it seems that you would be more of a topic expert - for you SEO, for me flooring, so it might help you rank a bit better on your topic authority.
I'm not surprised about ranking well for Westchester carpet pad. Almost no one has written about it, so not that hard to rank for it. This article was inspired by a customer who found me typing in "Westchester moisture barrier carpet pad." I realized that she found a page I wrote on my old site and I thought it was okay, but not great and not complete, so I rewrote and expanded it on my new site and I was really only competing with myself.
Yes, I still have some red errors on here that I need to fix. Wow, 9,500 dofollow links - I wasn't even aware. Maybe I need to share more on social media to balance things out. I don't know what "sitewide" links are. Again, I think the majority may have come from my blog on Active Rain. I need to look into this, as I may have accidentally created a monster. This is ironic, as I have not been doing a ton of link building, but I do blog daily on that site.
Great post - I'm happy to know that more business owners are coming to the Moz forums. I follow through and keep transparency in my work. I discuss each and every SEO process with site owners. Sometimes I show Webmaster Tools videos and SEOmoz blog posts to convince the site owner. It is very helpful in keeping a good client relationship, and keeping a good relationship is very important.
Hello, nice sharing. I am working as an SEO and there is an issue related to ranking. My team has worked on tourism sites for the last 2 years and the results were very good, but for the last 15 we stopped working on them and now the rankings have moved from 2nd to 32nd. We can change, update, or remove some links from our site - is it possible that this is affecting our ranking? Can you help me with some new trends to improve our SERPs?
Great stuff. Even the comments for this post are useful! I would also add CTR as a factor in there (not sure if somebody already mentioned it in the comments). After all, they tell us they use it for AdWords, and there has been plenty of other research done to show it's effective in SEO.
That was my hope when I asked others to chime in and there are a lot of great comments here with other ideas, suggestions, and things to check out. I'm glad everyone seemed to enjoy the post and hopefully it helps some people. :)
This is an excellent post! I guess I've always known that the metrics we use are only part of the story and no small combination of metrics will give you the whole picture. It's nice to see that concept so clearly described and broken down. My big question is, and it's something I've been struggling with ever since I started researching SEO, how do we cope with this? At the moment experimentation seems to be the approach I'm taking.
Thanks, Matt! This post is very helpful. I am one of the small business owners you wrote this for. I came to SEOMoz because of some freshly troubling SERPs, got seduced and carried away by trying to improve my Report Cards (read: obsessed); did that, and as a result may have contributed to getting myself buried in the SERPs and into devastating trouble for just the thing I was trying to make better. But I'll never know for sure, for all the reasons you state here. ;) Maddening! But we humans want to create order out of chaos and make sense of things even when it's futile.
But seriously, I always watch for your responses in the forum because I value them, and I think I've now spent almost 45 minutes on this page. No bouncy. Someone mentioned you have since written another post on this topic. Can't wait to read!
Hi Gina!
I'm glad you found the post helpful and you're right, you sound exactly like my target audience so I'm glad you found it at all! haha
It is maddening - and there is a lot of controversy over some of the ones I mention, as well. That's because we all just don't know. We know some things matter, we know others should, we know some don't... but yeah, I don't know anyone with the keys to the whole castle.
Thank you for your very kind comments about my Q&A responses - I try to take a few minutes every day to answer a few and help people. I always feel like I could do more but I'm glad they are generally helpful. :)
Another post on the topic? I'm not sure. Not at Moz I didn't. I have one on my blog that is similar called "Think Like Google, Genius." I don't want to link it in but if you search that in quotes, you should find it. Ignore my comment about density - the post is a little old and that particular item is out of date. I still stand by most of that post, though. Once we think LIKE Google, all manner of "context clues" become available to us.
Thanks!
~Matt
Hi,
I still have doubts about the effect of Domain Authority, Page Authority, Trust Flow, Citation Flow, and PR on the Google SERPs after Penguin 2.1.
Let me take one case study:
Keyword: “baju hamil” (Indonesian Language, mean: maternity clothes)
Search Engine: google.com and google.co.id
Results: about 12,600,000
Site to be analyzed: “mimihamil (dot) com”
Position in SERP with keyword: “baju hamil” is around 15
Summary Site Profile:
DA=1, PA=1, PR=N/A, Citation Flow=0, Trust Flow=0, Alexa Rank=N/A, Age < 1 year
If you look at the Google results, you can find several sites with much better DA, PA, Trust Flow, etc. below its position. Can you explain this phenomenon?
Best Regards,
SEO Starter
Great job on this one Matt!
What are your thoughts about blog articles that have higher PR, Domain Authority, and Moz metrics than their homepage? My communication manager has stumbled on a few of those and was wondering how that could happen.
Time on page must be a decent portion of the algorithm. For example, there’s a hidden benefit to having authorship status: If a user returns to the search results after reading an author-tagged search result for a certain period of time, Google will add three additional links to similar articles from the same author below the originally clicked link. Matt McGee wrote a blog about it last September that was pretty interesting: Google Confirms Hidden Benefit Of Authorship: Bonus Links After A Back-Button Click
Thanks for sharing, stay awesome.
Hi another-Matt!
First - thanks for sharing my post on G+. I saw that!
As far as the blog articles outranking the homepage, I actually think that's common. I think blog articles get the majority of social media links (who goes to Twitter and links to their homepage like "Check out this great company I am!"? lol). But a lot of people link to their latest posts - and those get picked up on non-social-media sites. People also use CommentLuv and the like to drive links back to blog posts specifically. Some blog comments obviously link to the main page - it depends what you're promoting, I guess.
Yes - I agree about time on site (duration in Analytics). I remember the 2 minutes on an authored page then using the back button actually *changes* the SERPs. Again, time on site must matter or why would they think "well, 2 minutes, you like this author, let's give you more of him or her"?
Thanks again!
Hi Matt, thanks for your article. I have clients asking me the same thing re outranking competitors and since Panda/Penguin it's getting harder and harder to find an answer. However your tips give a lot more insight into what to look for. I de-optimized my own site to the point where I had to add some keywords back in as I think I overdid it. I'm going to renew my domain for a few years ahead and see if it helps too.
A few questions for you, if you don't mind:
1) Pages per visit matter. & Visit duration matters.
Does that mean you're assuming Google is tracking non-Google-property behaviour? If so, how?
2) "Get 200 retweets on that same link, have it shared 50 times on Facebook and now it is pinned 8 times? Yes, that will affect your ranking."
Did you isolate your tests? What was your sample size?
Interesting that you say,
"We have tested these ourselves and found that social metrics matter “some.” If you get three retweets on your link, no, it won’t help you very much. You won’t notice a bump. Get 200 retweets on that same link, have it shared 50 times on Facebook and now it is pinned 8 times? Yes, that will affect your ranking."
I agree with you to some extent although my personal opinion is that it's not the social metrics themselves that impact rankings directly, it's more the indirect factors that come as a result of the social exposure. Getting a couple of hundred Facebook likes and Tweets increases exposure, which in turn is likely to increase links (either from others sharing your link in a forum, in a blog post or from scraper sites like Topsy.com) and may increase other metrics such as brand searches or searches for the URL etc.
This is one to show the senior management, but will they get it? It can take some explaining.
I think the scraper sites are the big one. Sites like Topsy, ManageFlitter, twopcharts, etc. that measure Twitter users all help. Same with Klout, Kred, etc. Anything that pulls in your social feeds has the potential to help - and the more those links are shared, the more they are scraped and seen. It's "automated" link building without being an automated spam tool.
I guess my point was/is that if you work on social, it helps (for lots of reasons, direct or indirect). If you don't, there's no chance for it to help.
I appreciate your post, but every search engine has its own algorithm, and to rank on a particular search engine we must follow it. If we need rankings in Google, we must follow Google's algorithm; otherwise all our SEO efforts will be wasted if we don't get the desired results.
Nazre, what he is saying is that it is impossible for us to know the algorithm - one, Google's business is entirely dependent on people not knowing it, and two, Google changes the algorithm constantly and consistently. All we can go on is our own trial and error and learning from others in the industry, and sometimes what works for me might not work for you, and vice versa.
I used SEOmoz for a few months and was faced with the same issue: in terms of the SEOmoz analysis we were ranking better than some competitors, although in real life this was not the case. It made me decide to stop using SEOmoz, as I couldn't see the added value compared to Google's own tools.
The biggest value I get is having OpenSite exports. Easily my favorite Moz tool. I use the SERP overlay to know how many links someone has. I also use PA & DA every day to see if it's their "overall" ranking helping them (DA) or one page they've been promoting (PA). If a site I'm competing with has a high PA, low DA, I know I can work on that. I've found it's harder to outrank sites with DA in the 60s and 70s than it is to outrank pages with PA in the 60s/70s, at least overall.
I also love FollowerWonk and have gained a few thousand twitter followers b/c of it. I haven't been using GetListed but I will. I can see a lot of value in having Moz stats and Moz tools available but I definitely understand if you use it just to see PA/DA/mR and mT how it would be frustrating to "win" most of those and lose the SERP battle.
I think a lot of site owners are looking for the "formula" to SEO success. They want to know that A+B=C every time and unfortunately, that's not how SEO works. I know it can be super frustrating for site owners, especially when they feel like they are being outranked by sites that don't deserve it. We can only control so much and do the best we can with what we know and understand.
Good point, Nick. I think there *is* a formula for success, but we don't know the whole thing and have no real way of figuring it out. Our best guesses are always going to be some version of: well, SEO starts with A+B+C+D+E+F+G+H+I+J+K/L*M=N. We may know a bit about A, B, and C, but really J, K, and M are speculation. F, G, and H we may not even think about.
Agree we can only do the best we know how. Thanks for the comment!
Come on guys... you simply cannot explain why on earth some spammy sites are still outranking genuine ones. Google is trying, but that does not mean the bad guys [smart guys, I would like to call them] won't be able to outrank others.
Nice post. But I think your point about on-page factors is a little vague. SEO folk certainly know what you're saying about keyword density in on-page content, but to the less enlightened, it may be translated to mean the actual content doesn't matter. All (hopefully all) SEOs know this to be untrue. Also untrue is saying that keywords do not need to feature in the on-page content. They do, but in a fashion that flows and educates human visitors. For the novices, this should be clearer.