Entertain the idea, for a moment, that Google assigned a quality score to organic search results. Say it was based on click data and engagement metrics, and that it functioned in a similar way to the Google AdWords Quality Score. How exactly might such a score work, what would it be based on, and how could you optimize for it?
While there's no hard proof it exists, the organic quality score is a concept that's been pondered by many SEOs over the years. In today's Whiteboard Friday, Rand examines this theory inside and out, then offers some advice on how one might boost such a score.
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about organic quality score.
So this is a concept. This is not a real thing that we know Google definitely has. But there's this concept that SEOs have been feeling for a long time, that similar to what Google has in their AdWords program with a paid quality score, where a page has a certain score assigned to it, that on the organic side Google almost definitely has something similar. I'll give you an example of how that might work.
So, for example, say on my site.com (a very simplistic website) I have three subfolders: Products, Blog, and About. I might have a page in Products, 14axq.html, and it has certain metrics that Google associates with it through activity they've seen from browser data, clickstream data, search data, and visit data (searches, clicks, and bounces back to the search results), all the engagement and click data that we've been talking about a lot this year on Whiteboard Friday.
So they may have these metrics: pogo-stick rate, bounce rate, deep click rate (the rate at which someone clicks to the site and then goes further in from that page), the time people spend on the site on average, the direct navigations people make to it each month through their browsers, the search impressions and search clicks, and perhaps a bunch of other statistics, like whether people search directly for this URL, or whether they perform branded searches. At what rate do unique devices in one area versus another do this? Is there a bias based on geography, device type, personalization, or any of these kinds of things?
But regardless of that, you get this idea that Google has this sort of sense of how the page performs in their search results. That might be very different across different pages and obviously very different across different sites. So maybe this blog post over here on /blog is doing much, much better in all these metrics and has a much higher quality score as a result.
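To make the concept concrete, here's a minimal sketch, in Python, of how signals like these could be blended into a single per-page score. Every metric name and weight here is invented for illustration; nothing below is Google's actual formula.

```python
from dataclasses import dataclass

@dataclass
class PageEngagement:
    """Hypothetical per-page engagement signals (names are invented)."""
    pogo_stick_rate: float    # share of SERP clicks that bounce straight back
    deep_click_rate: float    # share of visits that click deeper into the site
    avg_time_on_site: float   # seconds, averaged across organic visits
    direct_navigations: int   # monthly direct visits via the browser
    search_ctr: float         # search clicks / search impressions

def toy_quality_score(p: PageEngagement) -> float:
    """A made-up weighted blend; any real system would be far more
    sophisticated, nonlinear, and query-dependent."""
    score = 0.0
    score += 0.30 * (1.0 - p.pogo_stick_rate)           # pogo-sticking hurts
    score += 0.25 * p.deep_click_rate                   # deeper visits help
    score += 0.20 * min(p.avg_time_on_site / 180, 1.0)  # cap the time benefit
    score += 0.15 * min(p.direct_navigations / 1000, 1.0)
    score += 0.10 * p.search_ctr
    return score  # 0.0 (poor) to 1.0 (excellent)

page = PageEngagement(0.35, 0.40, 95.0, 420, 0.22)
print(f"toy quality score: {toy_quality_score(page):.2f}")
```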
Current SEO theories about organic quality scoring:
Now, when we talk to SEOs, and I spend a lot of time talking to my fellow SEOs about theories around this, a few things emerge. I think most folks are generally of the opinion that if there is something like an organic quality score...
1. It is probably based on this type of data — queries, clicks, engagements, visit data of some kind.
We don't doubt for a minute that Google has much more sophistication than the super-simplified stuff I'm showing you here. Google publicly denies a lot of individual metrics, like, "No, we don't use time on site. Time on site can be very variable, and sometimes low time on site is actually a good thing." Fine. But there's something in there, right? They use some more sophisticated version of that.
2. We also are pretty sure that this is applying on three different levels:
This is an observation from experimentation as well as from Google's statements:
- Domain-wide: across one domain, if there are many pages with high quality scores, Google might view that domain differently from a domain with a mix of quality scores, or one with generally low ones.
- Same thing for a subdomain. A subdomain may be looked at differently than the main domain, or two different subdomains may be viewed differently. If content appears to have high quality scores on one but not the other, Google might not pass all the ranking signals, or give the same weight to the quality scores, across them.
- Same thing is true with subfolders, although to a lesser extent. In fact, this is roughly in descending order: you can generally surmise that Google will pass these signals more across subfolders than across subdomains, and more across subdomains than across root domains. (A rough sketch of this roll-up follows this list.)
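Here's the roll-up sketch mentioned above: hypothetical per-page scores averaged at the subfolder, subdomain, and root-domain levels. The URLs and scores are invented, and the root-domain extraction is deliberately naive.

```python
from collections import defaultdict
from statistics import mean
from urllib.parse import urlsplit

# Hypothetical per-page quality scores; URLs and values are invented.
page_scores = {
    "https://site.com/products/14axq.html": 0.31,
    "https://site.com/blog/post-a": 0.82,
    "https://site.com/blog/post-b": 0.77,
    "https://sub.site.com/docs/intro": 0.55,
}

by_domain = defaultdict(list)
by_subdomain = defaultdict(list)
by_subfolder = defaultdict(list)

for url, score in page_scores.items():
    parts = urlsplit(url)
    host = parts.hostname or ""
    root = ".".join(host.split(".")[-2:])  # naive root-domain extraction
    folder = "/" + parts.path.strip("/").split("/")[0]
    by_domain[root].append(score)
    by_subdomain[host].append(score)
    by_subfolder[(host, folder)].append(score)

for label, groups in [("domain", by_domain), ("subdomain", by_subdomain),
                      ("subfolder", by_subfolder)]:
    for key, scores in groups.items():
        print(f"{label:9} {key}: mean score {mean(scores):.2f}")
```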
3. A higher density of good scores to bad ones can mean a bunch of good things:
- More rankings and visibility, even without other signals. So even if a page is somewhat lacking in those other quality signals, if it sits in this blog section, and the blog section tends to have high quality scores across its pages, Google might give that page an opportunity to rank well that it wouldn't ordinarily give a page with the same ranking signals in another subfolder, on another subdomain, or on another website entirely.
- Some sort of what we might call "benefit of the doubt"-type of boost, even for new pages. So a new page is produced. It doesn't yet have any quality signals associated with it, but it does particularly well.
As an example, within a few minutes of this Whiteboard Friday being published on Moz's website, which is usually late Thursday night or very early Friday morning, at least Pacific time, I will bet that you can search for "Google organic quality score," or even just "organic quality score," in Google's engine, and this Whiteboard Friday will perform very well. One of the reasons is probably that many other Whiteboard Friday videos, which are in this same subfolder, have been seen by Google performing very well in the search results. They have whatever you want to call it: great metrics, a high organic quality score. Because of that, the Whiteboard Friday you're watching right now, at the URL you see in the bar up above, is almost certainly going to rank well, possibly in that number-one position, even though it's brand new. It hasn't yet earned the quality signals, but Google gives it the benefit of the doubt because of where it is.
- We surmise that there's also more value that gets passed from links, both internal and external, from pages with high quality scores. That is right now a guess, but something we hope to validate more, because we've seen some signs and some testing that that's the case.
3 ways to boost your organic quality score
If this is true (and it's up to you whether you want to believe that it is or not), even if you don't believe it, you've almost certainly seen signs that something like it is going on. I would urge you to do these three things to boost your organic quality score, or whatever you believe is causing these same effects.

1. Add more high-performing pages. If you know what pages that perform well look like versus ones that perform poorly, you can make more good ones.
2. Improve the quality score of existing pages. If one page is low, and you can see that its engagement and usage metrics, its SERP click-through rate, and its bounce rate from organic search visits all look poor compared to your other content, you can boost it: improve the content, the navigation, the usability and user experience, the load time, the visuals, whatever you've got there to hold searchers' attention longer, keep them engaged, and make sure you're solving their problem. When you do that, you will earn higher quality scores.
3. Remove low-performing pages through a variety of means. You could take a low-performing page and say, "Hey, I'm going to redirect that to this other page, which does a better job answering the query anyway." Or, "Hey, I'm going to 404 that page. I don't need it anymore. In fact, no one needs it anymore." Or, "I'm going to noindex it. Some people may need it, maybe visitors to my website who need it for some particular direct-navigation or internal purpose. But Google doesn't need to see it. Searchers don't need it. I'm going to use noindex, either in the meta robots tag or in the robots.txt file."
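As a sketch of how one might triage a list of low performers along those three options (the URLs and decision rules below are invented; the meta tag shown in the comments is the standard noindex markup, and note that a noindex directive in robots.txt has never been officially supported by Google, so the meta robots tag or X-Robots-Tag header is the reliable route):

```python
from typing import Optional

# Invented URLs and decision rules, purely to make the three options concrete.
def triage(url: str, better_alternative: Optional[str], still_needed: bool) -> str:
    if better_alternative:
        # Option 1: 301 redirect, so users and link equity flow to the
        # page that answers the query better.
        return f"301 {url} -> {better_alternative}"
    if not still_needed:
        # Option 2: remove the page entirely and let the URL return a 404 (or 410).
        return f"404 {url}"
    # Option 3: keep it live for the visitors who need it, but drop it from
    # the index with a meta robots tag in the page's <head>:
    #   <meta name="robots" content="noindex">
    return f"noindex {url}"

print(triage("/products/old-widget", "/products/new-widget", True))
print(triage("/promo-2012", None, False))
print(triage("/customer-portal-help", None, True))
```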
One thing that's really interesting to note is that we've seen a bunch of case studies, especially since MozCon, when Britney Muller, Moz's Head of SEO, shared the fact that she had done some great testing around removing tens of thousands of really low-quality, low-performing pages from Moz's own website, and seen our rankings and our traffic for the remainder of our content go up quite significantly, even controlling for seasonality and other things.
That was pretty exciting. When we shared that, we got a bunch of other people from the audience and on Twitter saying, "I did the same thing. When I removed low-performing pages, the rest of my site performed better," which really strongly suggests that there's something like a system in this fashion that works in this way.
So I'd urge you to go look at your metrics, go find pages that are not performing well, see what you can do about improving them or removing them, see what you can do about adding new ones that are high organic quality score, and let me know your thoughts on this in the comments.
We'll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.
Hi Rand,
It's difficult not to like the 3 ways to boost section of today's whiteboard. I am reminded of the last couple of patents from Navneet Panda that I looked at recently, since he was the force behind Google's high-quality sites update (the "Panda" update).
One of those focused on repeat clicks to a site and the durations of visits to it; it was a continuation patent that Panda was an inventor on. I wrote about that one in:
Click a Panda: High Quality Search Results based on Repeat Clicks and Visit Duration
After reading that one, I took a look back further to see what else Panda had been working on, and ran into another patent that rewarded longer durations of visits to a site (and visits to categories associated with a site), and which addressed how Google might ignore some of the "noise" that might be associated with click data (we have been told by Google spokespeople that click data tends to be too noisy to be used in rankings), but if Google can find ways to ignore that noise, then maybe it is being used in such a manner. The second Panda patent is covered in:
A Panda Patent on Website and Category Visit Durations
If duration and return visits are quality signals that Google is paying attention to, then finding ways to keep someone entertained on your site, and to bring them back, may have a lot of value. :)
Bill
Great additions Bill (as always)! I agree that some click data is noisy, but so too is link data and content data and keyword data :-) I think Google's proven themselves pretty fantastic at taking noisy data and finding signals, so I find it nearly impossible to believe they don't use any type of engagement/user-behavior/click data in any way, especially given their propensity to use precisely that in the paid search rankings, and very clearly in localization and personalization biases (i.e. they obviously collect that data, and use it in some cases).
Appreciate the references, too. Good to know there's some research work they've clearly done in this direction.
I read somewhere that many big websites like Forbes got penalty warnings because they have thousands of bad posts (probably from accepting not-so-great guest posts), so if this is true for posts (which I'm pretty sure it is), then it's probably the same for pages. Anyhow, great post Rand :)
https://www.youtube.com/watch?v=EO3heV62UPs#t=23m0...
:-)
Rand, a pretty interesting and thought-provoking topic this time.
All three points for boosting the organic quality score are tried and tested, by you, me, Britney, and many more. But I have another suggestion: with low-performing pages, we can also look at improving the quality of the content those pages have.
For example, if I am writing something about SEO fundamentals, and there are already multiple high-performing pages from high-authority sites on Google, my page is not going to perform that well until I add some crisp data to it. But if I add some kind of case study within the same content, with more detailed challenges and quick solutions, and promote it heavily, it has a chance to perform as well as other SEO topics under the same subfolder.
The moral of my example is that re-purposing existing content to add more value can work better than redirecting low-performing pages, as redirects might affect page loading speed on both mobile and desktop. And Google hates that.
What are your thoughts, Rand?
Hi Himani,
This is a topic a colleague and I debate semi-regularly. His thought process is "some content is better than none," whereas I have advocated removing thin or uninspiring content: why spend time re-purposing content that contributes very little, that you may have inherited from days of old, or that isn't aligned with your goals?
My stance is that instead of adapting or building on re-purposed content, let's focus our efforts on a 10X content strategy, or at least strive towards something that resembles a solid attempt, as best we can. Often thin content is ill-researched, and it's easier to remove it or redirect it to content you've invested time into and that ticks your goals.
I'm in favour of Rand's suggestion, but let me know your thoughts!
Yeah - if you have content that isn't performing well for visitors and isn't of high value, I'd say remove it ASAP. Not only can it potentially harm your search engine rankings, it can also create the impression among visitors that your site/brand is low quality and not worth coming back to -- that's a reputation you definitely don't want.
Cieron, thanks a lot for taking the time to read my thoughts and reply. Firstly, like you, I am completely in favor of Rand's suggestions. I also completely agree with the point about creating a 10X content strategy rather than investing time and effort in re-purposing thin content. After all, it didn't perform because it was of no value. I understand that.
There have been instances where a few of my clients didn't want to see any kind of redirects on their website, and their long-time developers wouldn't let them add more redirects because "Google says this or that." I ran out of arguments for how redirecting thin content to highly valuable content would be better for their audience as well as their search engine rankings. But it was all in vain.
So I came up with another kind of content strategy that actually performed really well: re-purposing, re-editing, and re-sharing the content extensively, which improved the blogs' performance and helped the clients. It took more time than it would have if we had focused on creating fresh, high-quality content instead, but I did what felt right at the time.
I still believe that removing thin content can boost the performance of a sub-category or subfolder of a website. But I thought I'd ask the subject-matter experts whether the approach I used, which showed great results, can be considered a best practice.
Lastly, Cieron, I'm keen to learn from you: what would you have done in this kind of situation? This will help me get a better idea of how to handle such situations.
A few points on that:
1) Redirects (at least if we're only talking about a single one) shouldn't cause page-load-speed issues in Google's eyes. Google will see the redirect, but it will send the searcher directly to the target page without hopping them through the redirect.
2) Case studies may be helpful on some pages for some visitors, but they're not a universally good thing to add. Many queries don't demand case studies, and indeed it would be weird to find them in the results (e.g. if someone Googles "best zoos in the US," a case study on a zoo would be an awkward content fit for a result).
3) Re-purposing existing content is a fine thing to do in some cases, but again, isn't a universal good. I'd say to apply it only when you know you have the right content elsewhere and it really would solve a unique searcher problem by being re-purposed.
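On point 1, it's easy to verify that a redirected URL resolves in a single hop rather than a chain. A quick check using Python's requests library (the URL below is a placeholder):

```python
import requests

# Swap in a redirected page from your own site.
resp = requests.head("https://example.com/old-page", allow_redirects=True, timeout=10)
print("final URL:", resp.url)
print("hops:", len(resp.history))  # one hop is fine; long chains add latency
for hop in resp.history:
    print(" ", hop.status_code, hop.url, "->", hop.headers.get("Location"))
```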
Thanks Rand for the detailed guidelines, I was actually expecting them.
On redirects: the clients for whom I used re-purposing, re-editing, and re-sharing had a lot of thin content because, before hiring me, they had hired writers to fit their budget. So I asked them to redirect the lowest-performing pages and invest time and effort in the ones that didn't need redirecting. But somehow I was unable to convince them to redirect.
On case studies, totally agreed; your example made me laugh. I was just talking generally about sharing more data, facts, insights, case studies, challenges, best practices: everything that can be utilized for an industry-specific audience.
On re-purposing, thanks a lot for the clarification. I needed it.
I am now also looking at re-editing existing content where redirects got me stuck, as re-purposing is not the ideal solution there.
What do you think, Rand?
I have this kind of issue on my own site too. There are lots of pages with thin content, and some have redirect chain issues. I've tried to solve it many times without success. I'd love to know how to deal with these types of issues.
Very interesting: it's ranking number one! I had no idea about this.
Great post Rand,
I can tell you for sure that irrelevant traffic won't help you and can even harm you. We removed most of our irrelevant/poor-quality pages, as you can see here, and I saw amazing results afterwards.
I believe this has to be part of Google's algorithm, tied into CTR, the success of the site's other pages, DA, and more. Great breakdown, Rand, and even if it "doesn't exist," you can only benefit yourself by getting rid of low-performing web pages on your website or improving their content and readability. Very interesting stuff either way; thanks for covering this topic!
Hi Rand,
Is there a way to really test for this?
And how does it relate to Panda, which is a quality algorithm?
It's not easy to test for, but CTR can be an indicator, and high bounce rates/low browse rates on your pages can be another (especially if you suspect that the query takes longer than a few-second quick look to solve). You could also consider doing usability testing on the SERP itself (i.e. asking a few or a few dozen folks to perform the search query, then choose the result they think would be best, or asking them to visit your page and talking out loud about whether it solves their query).
Panda will bust pages that essentially have NO quality score! LOL. I can remember the days when e-commerce sites would have literally thousands of pages with the same content or little to no content on them. Doorway pages were busted, but we're talking thousands of useless pages. Essentially, quality score here will be about fine-tuning for the keyword, having some sort of BAT (behavioral action trigger) content on the page, and hopefully some sort of conversion for a new prospect, or a hard-lined conversion! It would be really hard to achieve this quality; it's a bit like "to move at the speed of light you need an infinite amount of mass and an infinite amount of energy" in order to supply the market with unicorn content!
Agreed 110%! One of the things I love to do for underperforming clients is to add content that is informative, low-conversion, low-competition, and extremely long-tail. Stuff like "why does my home's hot water smell like lemons?", where the search volume is too low to count but a couple of Quora questions show the demand is above zero. Even if it never earns any links, as soon as it starts getting some organic traffic, the rest of the site will start ranking a little bit better. I think that's a strong (anecdotal) argument for a quality score.
I would love to see a quality score based on engagement metrics (GA/GSC/Other Sources) within the Moz tool. There are lots of proxies for this, but none that tie them together yet? Rand, I have to say this is one of my favorite Whiteboard Fridays ever. :-) I look forward to seeing more tests.
Agreed! I saw that Ahrefs is buying CTR + return rate data (probably from Jumpshot, which Moz also uses for clickstream data). They calculate a "return to SERPs" metric (sort of like pogo-sticking, though it isn't clear whether the searchers in those cases are clicking on another result or simply going back/doing something else) that could also be a useful tie-in for this.
Since Moz Pro connects with GA, we should be able to do a browse rate+bounce rate+time on site type engagement metric, but the problem is figuring out what your competitors' metrics are. For many SERPs, a super short, one page visit is a good thing (the searcher solved their problem fast, then left), so the signalling could get really messy... Tough to figure out how to play this without human review and analysis.
That's why it's probably a bunch of things put together. There may be an outlier in bounce rates or time on site in certain cases, or an outlier in backlinks, interaction with the page (e.g. social share counts, comments) and all kinds of other things, but if one could tie them all together it would be interesting to see if those outliers don't get washed out by other signals.
Thank you for another useful article, Rand! I appreciate the efforts you are making for the SEO community.
I have 2 questions:
1) If I choose to improve the content of a low-performing page, how long should I wait to see any improvement in PageRank and SERP rank?
2) It is easier to add new content to blog posts, but if we are doing SEO for product/service pages there are limitations: we cannot use as much text, cannot use all the LSI keywords, cannot target keywords in H-tags, etc. It would be great if you could do a Whiteboard Friday on how to do on- and off-page SEO for product/service pages, or if you have already done one, please share the link.
Hi Rand,
Thanks for discussing this topic here. I believe this is something that needs much more focus. In an AMA session at SMX between Gary and Danny Sullivan, how Google might judge the quality of a website was a big part of the discussion, and here you are with a fine analysis. I also saw your tweet about experiments on pogo-sticking, and I must say that really worked. Still, a long click, along with some other factors such as a page's average time on page for a particular query, seems the main way to determine the quality aspect.
Also, Bill Slawski wrote about a patent related to this; the quality score formula was a bit different, or maybe my interpretation is different. It mentioned:
Site Quality Score = (users' interest in the site, as reflected in queries directed at the site) / (users' interest in the resources found on the site, as responses to queries of all kinds)
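To make that ratio concrete (this is my reading of the patent language with invented numbers, not an official formula): if users issue 2,000 queries a month that explicitly seek out the site, and the site's pages appear as answers to 20,000 queries of all kinds, the score would be 2,000 / 20,000 = 0.1. A site that gets sought out more often, relative to how often it merely shows up, scores higher.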
I would also like to mention the quality attribute E-A-T (Expertise, Authoritativeness, Trustworthiness). If content meets these, it will surely rank better and have better user engagement. Long clicks are no doubt the game-winner, but focusing on just being click-worthy shouldn't be the goal. Google has done extremely well in filtering out the "NOISE" signals. But they haven't opened up much yet.
One might achieve awesome CTR, but as Rand says, "correlation is not causation." Signals like E-A-T, CTR, dwell time, pogo-sticking, and a domain-level score are surely part of this whole big thing Google is applying. We still have to get confirmation on these, maybe from John Mueller or Gary.
Hi Rand,
It was a really good article. Along with the above factors, SSL certificates play an important role in ranking better or getting a better organic quality score. Google has been actively endorsing SSL certificates since 2014. In their official Webmasters blog post titled "HTTPS as a ranking signal," published in August 2014, they recommended HTTPS encryption for all.
The blog also states that "Security is a top priority for Google. We invest a lot in making sure that our services use industry-leading security, like strong HTTPS encryption by default…", adding further that "For these reasons, over the past few months we've been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms…" This made it clear that websites with SSL certificates would have an advantage in SEO rankings. In 2015, Google's Gary Illyes made a statement that "HTTPS may break ties between two equal search results." This pushed a lot of digital marketers towards SSL products and towards switching from HTTP to HTTPS.
I found this article on the internet and believe it has a good amount of information. You can read the full article here.
Do you think this applies to a small, local service company? We have published a few ranked blog posts that pull national traffic. As a local business, we have nothing to offer those visitors, so they get their info and leave (95%+ bounce rate). Less than 20% of our traffic is local and would ever do business with us. Is that 80% of traffic with a high bounce rate hurting our local rankings for other pages?
Great post again Rand. You were correct: I Googled "organic quality score," and this Moz post was No. 1 in the search results, incredible! Google certainly considers a lot of metrics when ranking a page, although it may not reveal them openly. Again, it comes down to the quality of the page, how engaging the page content is, and whether the searcher's intent is being addressed. If you can address those three requirements, no one can stop you from being at the top.
Nice :-) Glad to see my prediction came true so quickly. And yeah - agree that those are the requirements, and that, while they're hard to measure, they're immensely valuable when improved.
When it comes to SEO, I salute you, Mr. Rand. We all know that ranking on Google is very unpredictable; we just have to study it and test the waters to see results.
There can be no good marketing without a measurement in place or at least the start of a dialogue that leads to smarter decisions! Best Marketers Measure!
The fact that they have a Google AdWords Quality Score definitely makes me think there is some sort of organic quality score, and I think you hit it pretty much on the head, Rand! Awesome tips for boosting "organic quality score" as well. Focusing on improving or getting rid of low-quality pages on a website is definitely worth the time.
I always use QA as an indicator, but I don't rely on it 100%. I often speak with clients who talk to me about QA the way clients used to talk about PageRank, as though it's some kind of official metric they need to work hard on and focus on. They don't get it when I tell them just to keep an eye on it, not to obsess over it, and to concentrate on doing good online marketing and creating great content.
Hi Rand, you mention that this likely works at three different levels (domain-wide, subdomain, and subfolder). With this in mind, do you think it is better to have subfolders containing the various areas of your website, e.g. /blog/ or /case-studies/, or is it better to have no folder structure? From what you say, and from the Moz site itself, the suggestion is that folders may be better.
I suppose a number of factors influence this.
We've had a debate about this internally, and we never have any hard evidence either way, so often the decision is to do what feels right for each individual website. We do tend to stick to single-level subfolders... I suppose we just feel that's the most helpful structurally and navigationally. I think I may have answered my own question now!!
If you've got a small site with only a handful of pages, yeah, I'd consider a structure with no subfolders. However, if you want to start building up some unique sections (e.g. a blog, an about section, a products section, a news section, what-have-you), subfolders are good for organization, for indicating to users what they'll find in an area, and for Google to start creating some understanding of your site's sections.
What metric would be a good indicator that a page has a low quality score? Bounce rate above 80% and at least 5 sessions in the last 30 days?
I think a high bounce rate can mean that your page might be too good. The user finds the answers to their questions too quickly and closes the page. That does not mean your content is not good, but rather that there's no more juice to it, or you're not prompting the user to click around to other similar topics.
Yeah - Aida makes a good point. Some pages answer a query very quickly and that's a good thing. There's no single, hard and fast rule for this, which is what makes it such a challenging problem. Usability testing the SERP and your ranking page can help, so too can looking at metrics like CTR, bounce rate, pages-per-visit, etc. But this comes down to a lot of suggestive data that you'll need to apply intuition and thoughtful analysis on.
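One way to turn those caveats into a starting point: pull a GA-style landing-page export and flag pages that have enough traffic for the numbers to be trustworthy but show poor engagement. A minimal sketch in Python; the pandas column names, pages, and thresholds below are all invented for illustration:

```python
import pandas as pd

# Illustrative GA-style export; pages, columns, and values are invented.
df = pd.DataFrame({
    "page": ["/blog/post-a", "/products/14axq", "/about", "/blog/post-b"],
    "sessions": [1200, 45, 300, 8],
    "bounce_rate": [0.42, 0.88, 0.55, 0.91],
    "pages_per_session": [2.7, 1.1, 1.8, 1.0],
})

# Enough traffic to trust the numbers, high bounce, shallow visits.
# Quick-answer pages can legitimately bounce, so treat this as a
# review queue, not a kill list.
candidates = df[
    (df["sessions"] >= 30)
    & (df["bounce_rate"] > 0.80)
    & (df["pages_per_session"] < 1.3)
]
print(candidates)
```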
A lot of our repeat traffic "bounces" as they're just looking for our phone number.
Like @David Watkins notes above, dwell time and bounce rate figures really only make sense when you're also looking at the page they refer to, and examining its purpose and its call-to-action. That high bounce rate could just mean that the user clicked on that big button and went off to your YouTube channel or your external blog, just as you asked; or it could mean they landed on the page and WOAH! That's not what they wanted at all. Similarly with time: a long dwell might mean that users, like I did on this page, stuck around and watched the video. If you can't determine why dwell and bounce aren't what you expected, then throw in some UX testing to see if the answer shows up in click/scroll maps or user-journey videos.
Great presentation! I am going to noindex some low-performing webpages today and see what happens.
Cool! Let us know how it goes :-)
Hi Rand, thanks for sharing.
A useful trick I always use is to test my post with a fake campaign in Google AdWords to see the quality score of a single post.
What do you think about it?
Cheers
Very interesting, in-depth take on this. I think we too easily and too often overlook the impact of directory/subdirectory authority, and how all that PageRank trickles down. As for poor or low-quality pages, I did something similar due to some technical (mis)configurations, and added many pages to the robots.txt file via regex. This was the "quick fix," and I soon saw a massive increase in impressions. Excellent WBF as always!
We keep hearing stories with similar results! Thanks for sharing, Mark. I think there's something very valuable in limiting indexation to high-quality pages.
Is there a limit to how many subfolders/sub-subfolders you should have in a URL? I.e., would you get noticeably different quality scores with:
website.com/blog/post-title vs.
website.com/blog/category/post-title vs.
website.com/resources/blog/category/post-title
Thanks for the help!
I don't think it's a huge deal, but my general bias would be to make it as easy for visitors as possible. It's a lot like writing sentences: avoid extraneous folders like you'd avoid extraneous words. If it doesn't add value, don't include it.
In my opinion, of course they use relative CTR; it's the #1 ranking factor in AdWords after money, so why would they use anything different for organic? They're both just ordered lists! But no one in this industry has had an open mind about that for years, so it's very refreshing to hear someone talk about it. I went into some detail on why it would also explain the "honeymoon effect" here:
https://www.coconutheadphones.com/does-google-use-click-through-rate-as-an-organic-ranking-factor-answer-maybe/
Great article; I'm going to be implementing some of the tips. I just discovered Moz and am reading through all the blogs. The Title Keyword article was so useful.
I downloaded the Moz toolbar, and it's killing me by showing a spam score of 9/17!! I have a simple website with absolutely no spammy links. I then searched for your video on the topic from back in 2015, where you explained the reasons in detail... I feel a bit better, but not by much. I'm still a bit stuck on how to reduce that score.
Thanks again for all the Whiteboard videos!
Love the content, looking forward to implementing this one!
Off-topic and in the weeds, but is there a reason this page doesn't have an H1? Testing something?
Hi Rand,
Thanks for the video. I think the concept definitely makes a lot of sense, especially on the new-page side of things: if you've done good stuff in the past, chances are your new stuff will be good too (if only musicians were the same!). We're just about to run an audit of our own stuff, so we'll be removing and repurposing some content to see if that helps!
In one of my first in-house jobs, I redirected nearly 50% of the blog posts. They wrote a lot about marketing, but nothing interesting, just following a plan; the plan said: one, or better two, posts each week. (I changed that plan to "a great post whenever it's ready," by the way.) No unique value, no great insights, no good rankings: just a lot of "blah blah, you've read it a thousand times before" posts.
Visibility in a lot of tools went down (fewer pages on pages 4, 5, 6, 7 of Google...), which was a challenge to explain to the bosses. But those pages never had any organic visits. Who cares? Google's reaction was pretty fast: a month or two later, organic traffic reached a 10x level.
Hi Rand,
Is it really necessary to remove thin content or low-quality-score pages if we have a small business website?
If there are pages in our dealership "site map" but there are no links to them on the website or anywhere else, would they still have an impact on SEO? You can still go to the URL if you know the specific address, and the page is live.
Hi Rand! Thank you for another enlightening Whiteboard Friday.
I have a question about deindexing low-performing content: my company has a newsletter template gallery with about 900 templates in it, which is ranking really nicely in Google (check it out here). However, the individual template pages are ranking really badly, so you could call them low-performing. Your advice is to noindex them or similar, but I suspect that's not a good idea, because the main gallery page may be performing well precisely because it has all those poor-performing template pages inside. What do you think?
Thanks again!
Hi Rand,
Completely agree that Google passes the benefit of domain reputation to newly created pages. I've worked on a site that generates 5 million organic sessions a month. When there were new product launches, due to lack of information we used to create THIN content pages (low-volume content pages) for those products, but somehow our pages still used to beat the competition and sit in the top 5 positions.
Not only that, I've observed many more things about Google that are either extensions of popular beliefs or things I've never read anywhere, but I can't cite them as they need validation.
Thanks,
Harsh
Cool stuff! Finally a way to properly calculate something in SEO as well, and not only in PPC.
The truth is that even if it is not acknowledged publicly, some form of scoring has to exist, because otherwise, how do you decide which place each result should occupy? I think it's based on CTR (clicks + impressions), and then, once users are on the web pages, on time on site, bounce rate, or page views. What do you think?
Thanks to Whiteboard Fridays, we can stop and think about concepts like this.
Thank you
Hi, Rand
What about websites with infinite-scrolling navigation? For instance, how do you think Google interprets the engagement signals for those pages? Does Google "see" the pages that appear in scrolling navigation?
Thank you!
A perfect assumption. This is why Rand is the true ambassador of SEO. I think we can use a canonical to a similar page rather than noindexing the non-performing page.
Thanks Rand!
I will run tests noindexing low-performance web pages and see what happens :)
Well explained, Rand!
In the past, I have created a list (in Excel) of the pages and posts on my website and assigned each a goal, like:
1. ABC pages: to convert people
2. XYZ posts: for traffic
3. ABCD posts: for giveaways, and so on...
Based on it, I could easily see how each one was performing and whether it was fulfilling its goal or not.
It helped me make proper decisions when assigning scores, like whether a page should be removed or needs improvement. It's totally up to us how we define success and measure the score using Google Analytics and Search Console.
Nice points for improving the organic quality score of your website. Thanks @Rand Fishkin
Hi Rand,
When I started my blog, I was concentrating more on quantity than quality of the posts. As a result, I had a lot of low quality pages that were performing very poorly.
Back then, I never cared to read the Analytics data; I decided to remove these pages just because I thought they were not at all helpful to my visitors and were only bringing a bad reputation to my blog.
But to my surprise, once these pages were gone, my search rankings started to improve considerably. I was ranking in the top position for some very competitive keywords.
So, the point is, if I feel low quality pages are ruining the experience of my visitors, it is very much possible that Google feels the same way.
And I won't be surprised if Google actually uses an algorithm that rates pages based on various parameters.
Thanks for a great post. My question is: if the main category page is ranking well, will it help the pages under it rank?
I also got nice results from removing some low-performing pages on big websites, but I was sceptical about doing the same on small websites, like those with just 10 to 20 pages total. I wonder if anyone has tried that?
Hi Rand,
I like this article and hope to improve our organic quality score following this guide. Thanks for sharing.
Hi Rand,
Thanks for this explanation! I've had the same feeling for years, ever since Google Penguin hit one of our clients' websites. It seemed to pick on certain kinds of content, while the rest performed well despite Penguin.
I've been thinking a lot about discontinuing the bad pages myself. They increase the bounce rate and they don't convert, but these 3-4 pages bring 30% of the traffic... Popular queries, but people want things for free.
I am not brave enough to delete them, just because of the traffic.
What would you suggest?
Thanks,
Maria
Great Whiteboard Friday! As many have already mentioned, I searched (on mobile) for "organic quality score" and the top organic result was this post. What's more interesting is the Google Trends data for it over the past 7 days... and the jump it's showing now.
Digging even deeper into individual pages, we need to look at what we want the user to do on each page or, as you mentioned, in each subfolder (products / blog / about). On the blog, we might want time on site to be longer than, say, when reading "about us" content. On the product pages, we want users to navigate through more pages (product > cart > checkout), so more pages per user. I think this plays a role as well.
Interesting! I don't know how it's possible, but this is an interesting thing. I've been in this industry for many years, and I have my own Excel sheet (where I assign a quality score my own way) for all the SEO ranking factors, which is very helpful for getting better rankings.
Thanks for sharing, Rand! Now I'm feeling even more bummed about missing MozCon and Britney's talk. Would love to find out some more about that!
The MozCon videos are going on sale soon, if not already!
I've been saying this for about six months now. I don't know why it's taken me so long to get geared into pay-per-click training, but... I have to say this is a "big secret" that most don't see. Google will love you if you build like this and stay on top of getting your landing page relevancy correct. Essentially, you're on your way to becoming a domain authority in your niche! I guess the industry needed to hear this; I just wonder how many will actually see the value in it?!
This sounds a lot like the intent of "AgentRank" in their 2011 patent: the idea that the more engagement/metrics a post got, the more valuable the poster/site was deemed. By removing the "unwanted" or "lower quality" posts, it seems you would boost your "AgentRank." Wonder if that's what we're seeing.
Wasn't the infamous "Fred" update attacking low-quality pages? That should have been the first red flag. Also, going by this, meta descriptions are going to have more value than we think.
The video answered the dilemma I'd been having for a couple of weeks: "what do I do with some of the low-performing pages on my blog?" Now I know. Thanks, Rand
Really interesting. I like the thought of optimising the low-performing pages to get better results for all the other ones. You can almost optimise your site the same way you optimise your AdWords campaigns... :-)
Hello Rand, is a bounce rate of 62.50 percent good or bad? How much should it be?
Interesting post Rand!
You are a genius!!!
I can imagine that as more SEOs use the noindex tag for low-performing pages, one fine day Google could see this as a manipulative technique and punish those domains.
I don't think that would happen, but I can imagine what you're thinking: trash domains with only some good-performing stuff left indexed. That wouldn't work:
1st: there is no reason for trash, not for Google, users, or anyone.
2nd: it is not possible for trash domains to get a great score for the entire domain.
So optimizing non-performing content (redirect, delete, noindex, or add value) is something I do very often. It's not worth a penalty and it is not manipulative; Google would say: thanks for cleaning up my index.
And of course there is a lot of content which can't perform for various reasons and needs to be noindexed, but which is not trash.
Thank you Rand, I know I say this a lot, but we can always count on a good Whiteboard Friday!
Thanks Cory!