Your rankings have dropped and you don't know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It's an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to talk about diagnosing a site and specifically a section of a site's pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we've got a fairly extensive process here, so let's get started.
Step 1: Uncover the problem
First off, our first step is uncovering the problem, or finding whether there is actually a problem. This matters most if you have a larger website. If we're talking about a site that's 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, or hundreds of thousands of pages. So what I like to urge folks to do is to
A. Treat different site sections as unique segments for investigation. You should look at them individually.
A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I'm going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let's say I have a website that's dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I'm ranking for fewer of them, and it's my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.
B. Check traffic from search over time.
So I go to my Google Analytics or whatever analytics you're using, and you might see something like, okay, I'm going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that's a big traffic drop. We fell off a cliff there for these particular pages.
This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It's going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it's not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
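If you'd rather pull this out of a raw export than eyeball the analytics UI, a minimal pandas sketch can surface that per-section cliff. It assumes a CSV export with "date", "landing_page", and "organic_sessions" columns; those names are placeholders, so adjust them to whatever your analytics tool produces.

```python
# Minimal sketch: segment organic traffic by top-level subfolder.
# Assumes a CSV export with hypothetical columns:
#   date, landing_page, organic_sessions
import pandas as pd

df = pd.read_csv("organic_traffic.csv", parse_dates=["date"])

# Bucket each landing page into its top-level section, e.g. "/cities"
df["section"] = "/" + df["landing_page"].str.strip("/").str.split("/").str[0]

# Weekly organic sessions per section. A cliff in one column that the
# sitewide total hides is exactly what we're hunting for.
weekly = (
    df.groupby(["section", pd.Grouper(key="date", freq="W")])["organic_sessions"]
      .sum()
      .unstack(level="section")
)
print(weekly.tail(12))
```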
From there, I'm going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it's almost certainly something you've done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or '17 or '18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.
C. Perform some diagnostic queries, or use your rank tracking data if you have it for these types of queries (there's a small sketch that builds these queries after this list).
This is one of the reasons I like to track rankings even for these types of queries that don't get a lot of traffic.
1. Target keywords. In this case, it might be "Denver population growth," maybe that's one of your keywords. You would see, "Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?"
2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus "Denver population growth," so My Site or MySite.com Denver population growth. If you're not ranking for that, that's usually an indication of a more serious problem, potentially a penalty or some type of dampening that's happening around your brand name or around your website.
3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, maybe only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page, in order, and search it in Google without quotes. I do not want to use quotes here, and I want to see how it performs. This might be several lines of text here.
4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in Google inside quotes. If I'm not ranking for the unquoted version but I am ranking for the quoted one, I might surmise this is probably not duplicate content. It's probably something to do with my content quality or maybe my link profile, or Google has penalized or dampened me in some way.
5. site: urlstring/ So I would search for "site:MySite.com/cities/Denver." I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it's been a month. I wonder why they haven't come back. Maybe there's some sort of crawl issue, robots.txt issue, or meta robots issue, something that's preventing Google from getting there. Or maybe they can't get there at all and the search returns zero results, which means Google hasn't even indexed the page. Now we have another type of problem.
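If you want these five checks in a repeatable form, here's a tiny sketch that just builds the search strings for a given page so you can paste them into Google by hand. The brand, keyword, URL, and snippet below are all placeholders:

```python
# Builds the five diagnostic queries described above for manual checking.
# All of the inputs below are placeholders; swap in your own page's values.
def diagnostic_queries(brand, keyword, page_url, text_snippet):
    return [
        keyword,                # 1. target keyword
        f"{brand} {keyword}",   # 2. brand name + target keyword
        text_snippet,           # 3. 10-20 word string from the page, no quotes
        f'"{text_snippet}"',    # 4. the same string, in quotes
        f"site:{page_url}",     # 5. is the page indexed at all?
    ]

for query in diagnostic_queries(
    "MySite.com",
    "Denver population growth",
    "MySite.com/cities/Denver",
    "Denver's population grew at one of the fastest rates of any large US city",
):
    print(query)
```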
D. Check your tools.
1. Google Search Console. I would start there, especially in the site issues section.
2. Check your rank tracker or whatever tool you're using, whether that's Moz or something else.
3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you've run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.
4. Site uptime. So I might check Pingdom or other things that alert me to, "Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen," those types of things.
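If you don't have an uptime monitor in place yet, even a tiny scheduled check beats nothing. A bare-bones sketch (the URL is a placeholder; a dedicated service like Pingdom will alert you far more reliably):

```python
# Bare-bones uptime check. Run it on a schedule (cron, etc.) and log the
# results; a real monitoring service is more reliable, but this catches
# the "site was down for days" case. The URL is a placeholder.
import requests

def is_up(url):
    try:
        return requests.get(url, timeout=10).status_code == 200
    except requests.RequestException:
        return False

if not is_up("https://mysite.com/"):
    print("Site appears down -- a likely explanation for a traffic drop.")
```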
Step 2: Offer hypotheses for falling rankings/traffic
Okay, you've done your diagnostics. Now it's time to offer some hypotheses. So now that we understand which problem I might have, I want to understand what could be resulting in that problem. There are basically two situations you can have: rankings have stayed stable or gone up but traffic has fallen, or rankings and traffic have both fallen.
A. If rankings are up, but traffic is down...
In those cases, these are the five things that are most typically to blame.
1. New SERP features. A bunch of featured snippets have entered the search results for city population growth queries, and so now number one is not what number one used to be. If you don't get that featured snippet, you're losing out to one of your competitors.
2. Lower search demand. Like we talked about in Google Trends. I'm looking at search demand, and there are just not as many people searching as there used to be.
3. Brand or reputation issues. I'm ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They're choosing someone else actively because of reputation issues.
4. Snippet problems. I'm ranking in the same place I used to be, but I'm no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.
5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they've clicked in the past or where they're located. Location is often a big cause for this.
So for many SEOs for many years, "SEO consultant" resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now "SEO consultant" results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.
B. If rankings and traffic are down...
If you're seeing that rankings have fallen and traffic has fallen in conjunction, there's a bunch of other things that are probably going on beyond the ones above. A few of those could still be responsible: snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it's things like this:
1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you're providing just isn't good enough.
3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it's just not serving the new searcher intent.
4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you're not solving the searcher's query as well. Your user interface, your UX is not as good. Your keyword targeting isn't as good as theirs. Your content quality and the unique value you provide isn't as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.
5. Technical issues. So if I saw from the diagnostics over here that crawling was the problem (I wasn't getting indexed, or Google hasn't updated my pages in a long time), I might look into accessibility issues: speed, problems letting Googlebot in, HTTPS problems, or indexable content, where maybe Google can't see the content on my page anymore because I made some change in the technology of how it's displayed. Or crawlability: internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff. (There's a quick robots.txt check sketched after this list.)
Maybe at the server level, someone on the tech ops team of my website decided, "Oh, there's this really problematic bot coming from Mountain View that's costing us a bunch of bandwidth. Let's block bots from Mountain View." No, don't do that. Bad. Those kinds of technical issues can happen.
6. Spam and penalties. We'll talk a little bit more about how to diagnose those in a second.
7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo sticking, like people are coming to your site, but they are clicking back because they weren't satisfied by your results, maybe because their expectations have changed or market issues have changed.
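Before moving on: one of the fastest technical suspects from #5 above to rule out is robots.txt. Here's a minimal sketch using Python's standard library that checks whether Googlebot is blocked from the affected section. It won't catch meta robots tags, server-level blocks like the Mountain View example, or rendering problems; the URLs are placeholders.

```python
# Quick robots.txt sanity check: can Googlebot fetch the affected pages?
# Uses only the standard library. URLs are placeholders. This does not
# detect meta robots tags, IP-level blocks, or rendering issues.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://mysite.com/robots.txt")
rp.read()

for path in ["/cities/denver", "/cities/milwaukee"]:
    url = "https://mysite.com" + path
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED for Googlebot"
    print(f"{url}: {status}")
```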
Step 3: Make fixes and observe results
All right. Next and last in this process, what we're going to do is make some fixes and observe the results. Hopefully, we've been able to correctly diagnose and form some wise hypotheses about what's going wrong, and now we're going to try and resolve them.
A. On-page and technical issues should solve after a new crawl + index.
So on-page and technical issues, if we're fixing those, should usually resolve pretty fast, especially on small sections of sites. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we're talking about a large section on a site, many thousands of pages, because Google has to crawl and index all of them to get the new sense that things are fixed before traffic comes back in. And since that traffic is long tail, spread across many different pages, you're not going to see an instant gain; the rise will be slower.
B. Link issues and spam penalty problems can take months to show results.
Look, if you have crappier links or not as good a link profile as your competitors, growing that can take months or even years to fix. Penalty problems and spam problems, same thing; Google can sometimes take a long time. You've seen a lot of spam experts on Twitter saying, "Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today," because Google made some fix in their latest index rollout or their algorithm changed, and it's sort of, "Okay, we'll reward the people for all the fixes that they've made." Sometimes that happens in batches that take months.
C. Fixing a small number of pages in a section that's performing poorly might not show results very quickly.
For example, let's say you go and you fix /cities/Milwaukee. You determined from your diagnostics that the problem is a content quality issue. So you go and update those pages with new content that serves searchers much better. You've tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you've left 5,000 other cities pages untouched.
Sometimes Google will sort of be like, "No, you know what? We still think your cities pages, as a whole, don't do a good job solving this query. So even though these two that you've updated do a better job, we're not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section." That is a real thing that we've observed happening in Google's results.
Because of this, one of the things that I would urge you to do is if you're seeing good results from the people you're testing it with and you're pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.
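Picking that subset can be as simple as a seeded random sample, so the rollout and holdout groups stay reproducible while you measure. A minimal sketch (the page list is a stand-in for your real URL inventory):

```python
# Pick a reproducible 50% rollout subset of pages, per the suggestion above.
# The page list is a stand-in for your real URL inventory.
import random

pages = [f"/cities/{city_id}" for city_id in range(5000)]

random.seed(42)  # fixed seed so the same subset is chosen on every run
rollout = set(random.sample(pages, k=len(pages) // 2))
holdout = [p for p in pages if p not in rollout]

print(f"{len(rollout)} pages get the new content; {len(holdout)} held back")
```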
D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.
So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, "Ah, we have something new to judge. Let's see how these location pages on MySite.com perform versus the old cities pages."
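If you go this remove-and-replace route, it's worth verifying that every old URL returns a single-hop 301 (not a 302, and not a redirect chain). A minimal sketch using the requests library; the URL mapping below is a placeholder:

```python
# Verify that old /cities URLs 301 in a single hop to the new /location
# URLs. The mapping below is a placeholder for your real migration map.
import requests

old_to_new = {
    "https://mysite.com/cities/denver": "https://mysite.com/location/denver",
    "https://mysite.com/cities/milwaukee": "https://mysite.com/location/milwaukee",
}

for old_url, expected in old_to_new.items():
    # Some servers handle HEAD poorly; switch to requests.get if needed.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")  # may be relative on some servers
    ok = resp.status_code == 301 and location == expected
    print(f"{old_url} -> {resp.status_code} {location}: {'OK' if ok else 'CHECK'}")
```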
So I know we've covered a ton today and there are a lot of diagnostic issues that we haven't necessarily dug deep into, but I hope this can help you if you're encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We'll see you again next week for another edition of Whiteboard Friday. Take care.
Hi Rand. This was a really good WBF. My team and I watched it with great interest this morning as traffic drop assessments make up a huge chunk of our work these days. We picked up a few tips that we will incorporate into our assessments.
I did want to comment on a couple of things in the video. The first is the issue with link spam. I miss the old days of Penguin, where we would do disavow work and then Penguin would update, and in many cases we'd see fantastic improvements. However, in our experience, that really doesn't happen much anymore. When Google ran Penguin 4.0 in September of 2016, they made it so that it is baked into the algorithm AND it is not supposed to have a penalizing effect anymore. Rather, it simply devalues links that are unnatural so that they can't help a site. As such, we don't tend to see those great recoveries anymore.
I know there will be some people who will say that they have seen good recoveries after disavowing, but I have not seen much evidence of this since the fall of 2016. I do think that there are still some types of algorithmic link issues that could be fixed by disavowing, but the vast majority of sites are not going to be affected by these. We're currently testing some disavow work on sites in really competitive verticals, so hopefully I'll have more info on that soon. But, the main point that I wanted to make here is that, in my opinion, most sites really shouldn't be spending extensive time and money on link auditing and disavow work. If your traffic has dropped, it's usually because of something other than links.
With that said, there are certain dates on which it looks like Google tweaked Penguin to get better at figuring out which links to ignore. One example is February 1, 2017. We saw a lot of sites that had sophisticated link schemes that worked in the past and then appeared to have less effectiveness on that day. Those sites saw a steady decline after February 1, 2017 as Google devalued more and more links. Those sites will not see a nice recovery after disavowing though as disavowing won't cause them to get that lost PageRank back.
I also wanted to comment on the section of the video that says that in most cases you should see improvements after the site/pages are recrawled. I find that this is only true in some limited cases. Some algorithms, like the keyword stuffing algorithm and possibly the page layout algorithm, likely run each time the page is crawled. But many take much, much longer.
In my experience, if a site is hit by a quality update, in order to see improvements there have to be dramatic changes to content, technical issues, etc. I really liked your point about not expecting to see changes after fixing just two of thousands of pages. In most cases, core quality updates seem to be a sitewide thing. So, in order to see ranking and traffic improvements, we have to make the entire site (or at least a vast majority of the site) SIGNIFICANTLY better.
For clients of ours that have been able to make those significant changes, it usually takes months to see the results, which can be frustrating. This can be even longer if you have a large site.
Here are a few other tips that I can add (based on doing hundreds of traffic drop assessments):
-While we don't see as many manual actions these days as we used to, they still exist. Be sure to check Google Search Console --> Search Traffic --> Manual actions to make sure that there is nothing in there. It's also a good idea to check the security section of GSC.
-If your traffic drops, isolate Google organic traffic in Google Analytics to see if the drop is happening there, or whether it is somewhere else. I can't tell you how many times a site owner has come to us because they've seen a traffic drop, and it turns out that their PPC campaigns stopped running or they stopped getting referrals from a lucrative source. On one recent site we reviewed, they had slowed down on social media posting and email marketing, and that was the cause of the traffic drop; their Google organic traffic was fine. (There's a small sketch of this isolation step after these tips.)
-If you do see a traffic drop in Google organic traffic, is it also there in Bing and Yahoo organic? If so, it's not likely to be due to a Google algorithm change. In these cases I'd be looking for technical issues affecting the site.
-Pay close attention to Google's Quality Raters' Guidelines. We have found that since February of 2017, most algorithmic changes that were significant enough for us to notice on a widespread basis seem to cover things that are mentioned in the QRG. For example, on February 7, 2017, we saw loads of sites drop because of a lack of expertise, authoritativeness, and trust (E-A-T). One example was a site that offered legal advice, but none of their content was written by lawyers or anyone with actual legal experience. The same is true of medical sites and other your money or your life (YMYL) sites as well. March 8, 2017 (Fred) was another date on which this type of thing happened.
-Take a really close look at your competitors, and perhaps get people who aren't part of your company to compare your site to competitors'. I really feel that Google is getting better and better at determining which sites provide the best experience to users. Let's say you have an eCommerce site that sells green widgets. In the past, if you optimized your page well and were able to get some links to it, it would probably perform well. But now, what we're seeing is that pages without a big link profile can outrank well-optimized pages. Often those well-ranking pages are super valuable to people. The green widgets page on our site might have a unique product description, but theirs also has a bunch of reviews, some guides on how to use the products, better photos, and even videos that help me with my purchase decision. If you're an eCommerce store that is seeing a slow decline, or a decline connected with a quality update, there's a good chance that Google doesn't think your site is the best of its kind.
-Make good use of fetch and render. We've seen pages that dropped and when we render the page half the content is missing. This is often the case with fancy scripts and tools that make the page look snazzy, but make it hard for search engines to assess content.
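Since a couple of these checks come up constantly, here are two quick sketches. First, the Google-organic isolation step from a few tips up, as a minimal pandas sketch; it assumes a CSV export of sessions with "date", "source", "medium", and "sessions" columns (those names are made up, so adjust them to whatever your export uses):

```python
# Isolate Google organic from everything else in a sessions export.
# The column names are assumptions; adjust to your analytics export.
import pandas as pd

df = pd.read_csv("sessions_by_channel.csv", parse_dates=["date"])

is_google_organic = (df["source"] == "google") & (df["medium"] == "organic")

def weekly_sessions(frame):
    return frame.groupby(pd.Grouper(key="date", freq="W"))["sessions"].sum()

# If only the second series shows the drop, Google organic is fine and
# the culprit is PPC, referrals, social, email, or another channel.
print(weekly_sessions(df[is_google_organic]).tail(8))
print(weekly_sessions(df[~is_google_organic]).tail(8))
```

Second, a crude first pass at the render problem: fetch the raw HTML and check whether a key phrase from the page is present before any JavaScript runs. If it's missing here but visible in your browser, the content likely depends on client-side rendering and deserves a proper fetch and render test. The URL and phrase are placeholders:

```python
# Crude render check: is a key phrase present in the raw, pre-JavaScript
# HTML? The URL and phrase are placeholders.
import requests

html = requests.get("https://mysite.com/cities/denver", timeout=10).text

phrase = "Denver population grew"
if phrase.lower() in html.lower():
    print("Phrase found in raw HTML: likely server-rendered.")
else:
    print("Phrase NOT in raw HTML: content may depend on JS rendering.")
```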
I'll leave it there for now, as this should probably be a blog post on its own. :)
Thanks again for this WBF Rand.
Marie
Thanks for your clear explanation
If Google keeps crawling links to your site and shows you zero, what does that mean? Sorry to interrupt and jump in with another question.
You mean you're seeing zero links to your site in Google Search Console? There are two main reasons for this:
1) If you've just set up GSC it can take a few days or even up to a week for the links to appear.
2) You may be looking at the wrong variant in GSC (i.e. http rather than https).
If you have a new site, it's also possible that Google hasn't found any links worth counting yet, but usually there are at least a few in there.
Hi Marie,
If Google Search Console says N number of links while other tools report very few links, which one is to be trusted?
Neither can be 100% trusted. But there's another way to test; read below.
Our test: we onboarded a huge medical practice as a new client with a website that supposedly had only 40 backlinks according to Google Search Console. We found 70 others from high domain-ranking sites from 6 months back. Moz Pro only recorded 30 of those other 70. SEMrush recorded only the 40 that Search Console did.
Your best bet is to record the links you're building in a spreadsheet. Historical backlinks ... I'm not convinced there's a good way to track them.
The awesome sauce is how we were able to get all 70 of those backlinks credited 120 days later. Search Console, Moz, and SEMrush now see all 110.
The funny part though: after all the optimization, all the backlinks, and local SEO to a very specific high-conversion page, we were able to move from page 3 to page 1, fluctuating between results 6-8. But it's clear that the age of that page is a key factor in rankings. So you may have to wait at least 12-14 months before you see that result move from rank 6 to a rank above the fold in Google search results.
The cool thing, and this is what you can do: we saw our Google AdWords CPC drop from $35 to $4, which we attribute to all the backlinks.
So remember, if you're doing something related to backlinking, don't wait a year or two to show results to your clients. Put up Google AdWords and watch that CPC drop like crazy. You'll know EXACTLY when the backlinks are credited (down to the hour of the day).
Hi Simon, gaining extra links to your page was not the cause of your AdWords CPC drop. There is no connection between organic ranking factors such as links and your AdWords account.
I agree, Blaine.
Simon, did you analyse whether there were any featured snippets you got for the business which might have led to increased conversions over other channels? I guess it might have something to do with assisted conversion reports too. A more granular analysis would do all the work in this case.
Hi Amit,
None of the tools are super accurate. Usually the opposite happens, though: you may see thousands of links in the commercial providers and only a few hundred in GSC. When this happens, it's often because Ahrefs, Majestic, etc. are showing many different URL variations.
It's hard to say which report to "trust". I'd likely look at the links that GSC is showing you and see if they make sense. If you have had a sudden influx of links when you weren't expecting it, it can sometimes be a sign that the site has been hacked.
Thanks for the reply, Marie!
I have made this very clear and I do trust the data from GSC. But sometimes it's hard to make your clients believe the same. They consider free tools not so authentic and trust paid tools' results more.
I strongly agree with your research... I also have a similar case study from my own research.
Thanks Marie, great addition to a good thorough WBF.
I'm just wondering about your paragraph on Google's Quality Raters' Guidelines. When you noticed a legal site that did not have content written by a lawyer, was this because the content was low quality? How do you suppose Google knew it wasn't written by a lawyer?
As in, was the content low quality because a lawyer had not written it, or because it was not all correct or up to the right standard?
This is a great process that we have definitely implemented certain steps on for clients (and ourselves!). Such a great topic for anyone who thinks they are doing everything right, but are still not ranking well. Thanks for covering this in such great detail Rand and for providing simple steps that anyone can follow to diagnose the ranking issue.
Thanks Rand for this informative article and your efforts. One of my blogs was the victim of a massive drop in rankings and traffic. When we investigated, it turned out someone had used shady practices and pointed a bunch of spammy backlinks at our website to negatively impact our rankings. So I think spammy backlinks need to be checked as well.
But even the impact of that has been reduced a bit. These days, Google tends to just discount the power of backlinks from spammy sites rather than penalising your site. So basically, if you have 10 spammy backlinks pointing to your site, Google just won't attribute as much value to them, rather than penalising you for having them, if that makes sense.
This is totally my opinion and everyone has the right to disagree. I am just trying to help :-)
You're right, Junaid. Dodgy or bad links are no longer supposed to derank a website, at least not as much as they used to. Instead, the poor links themselves are devalued, meaning any benefit of having said 'dodgy link' is removed.
That being said, I think your complete backlink portfolio is considered as a whole by Google. If there is a considerable number of dodgy links compared to your number of high-quality, authoritative, relevant links, then this could be problematic.
Great post and comment though, thanks!
Bad links should normally be devalued. But I saw a few of my websites drop in rankings after a spam attack with porn backlinks. I also had some "interesting" rankings for my sites :D.
So in my opinion you should always check your backlink profile :)
Yeah, with the Penguin update, it's supposed to devalue these poor links rather than penalize the site. I still submitted a disavow file after some timely research. I think it shows Google that you are willing to get rid of these crappy links and are making an effort to improve your backlink profile, so in theory, they should reward you for your efforts.
Love the helpful and practical tips under the diagnostic queries section.
Thanks for a great WBF, Rand!
As usual "Whiteboard Friday" rocks, I'm working on a project related to this topic, that becomes this articles in something really useful.
Wow, great way to analyze why a page is not ranking and/or dropping in rankings. We often first look at potential technical issues like you mentioned in Step 2, or if rankings are dropping, we check out what the online competition has been doing (Moz Pro is great for this as well, btw). Wonderful article and solid tips!
Can someone expand on Step 1: 3b vs. 3c? What are we looking for when searching a long string WITH and WITHOUT quotes?
Hi everyone,
I've been having an issue with a severe drop in rankings (#2 to #36ish). All of my technicals seem to be ok, however I seem to be getting my images hotlinked (which I have killed in nginx) from these spam like pages that pull and link to an image on my site, then link again with a " . " for the anchor. Even more strange is that these pages are titled and marked up with the same titles and target key words as my site. For example, I just got a link yesterday from a site leadoptimiser . me which is IMO a junk site. The title of the page is the same as one of my pages, the page is pulling in images relevant to my page, however the image sources are repos EXCEPT for 2 images from my site which are hotlinked to my pages image and then an additional <a>.</a> link is placed to my website.
I have gotten over 1500 of these links in the past few months from all different domains but the website (layout etc) is always the same. I have been slowly disavowing some of them, but do not want to screw up anything in case these links are already being discounted by G as spam and not affecting my rank. The community seems to be really split on the necessity of disavowing links like these. Because of these links, according to Ahrefs, my backlink profile is 38% anchor text of "." .
Everything else checks out in both my own review as well as Moz tools and Ahrefs with very high quality scores etc. Webmasters is fine, indexing is fine, pagespeed insights is in the 90's, ssl is A+.
Any words of wisdom?
Thanks Rand for all of your WBF's they have been a staple for myself and staff for years!
Hi Zach, thank you so much for commenting! :) Sometimes detailed questions like this can get a bit lost in the blog comments section, so I'd also recommend posting this in the Moz Q&A (https://moz.com/community/q) if you find you still need some help here; it'll get a bit more visibility, and the Q&A is devoted to crowdsourcing answers to questions just like this! Hope that helps a bit and have a great day.
In my opinion, links like this rarely hurt a site. I'd be looking for other causes for the drop.
I have had great success with 301s for city-specific pages, as Rand mentioned above in Step 3, D, for a service-industry website. After transitioning from HTTP to HTTPS we did see a significant dip for about 2 months, which slowly rose back up (almost all the way). After setting 301s from dozens of city-specific pages to newer URLs, we noticed a pretty quick increase in traffic and CTR. My 2 cents!
Hi, apart from the SEO angle, what I liked about this article is the other perspectives that also need to be looked into, like brand name or new things your competitors may be offering. These are angles we often forget to look into.
Hi Rand,
That's a very insightful post. We come across such issues routinely, where a webpage is ranking pretty well for relevant queries but just not getting traffic. We have been successful in boosting the CTR of such pages to a certain extent by changing the title, meta description, and page content (a little bit).
Apart from these, I have seen a few issues where a website loses rankings in a small span of time because of these reasons.
Hope these help folks in some way.
Thanks
Hi Praveen,
I also realized that making adaptations to my title/meta and page content helped a little bit, but rankings are still not the same as they previously were. It might take some more time. Oftentimes I feel like it's a combination of a couple of factors that affect your ranking.
Totally agree with you, Abel.
The experiment we did was with pages that were ranking on either the first or second page but were not attracting user clicks. After changing the title and meta description, we actually did notice more visits from those pages and a little push in the rankings. :)
Thanks
Helpful material.
You've discussed one point, spam score. How can I reduce the spam score for a website?
Great timing Rand - I had a client that had rankings drop last week which I couldn't figure out. We'll see in the coming weeks if I have learned correctly from you.
Cheers once again for the great content.
Quite a post. After reading this I realized I was missing some major points in both on-page and off-page SEO. I'll do my best to use your recommendations to improve my online reputation, and I'll get back to you with the results. I would also like your perspective on how to check the TrustRank of my website.
Thanks for sharing this article with us. I need this information to improve my website's rankings. But I have one question in mind: if our website has good traffic compared to competitors but isn't ranking, what is the reason behind it? I know the answer, but I want to hear it from your side.
Thanks Rand! I've found this information so useful for my website, as these days I am also struggling to rank well in the SERPs. I also want to replace one of my pages with a new URL and content.
But the question in my mind is: if I create a new page and 301 redirect the older page to it, won't it inherit the adverse effects of the old landing page? (According to my diagnosis, the issue is with my very old backlinks.)
Hey Rand!
Thanks for another great WBF
Recently one of my clients faced an issue where traffic from mobile devices dropped. I recommended they implement AMP, and now they see improved traffic from mobile devices too. I always keep an eye on:
These few steps help me identify whether there is a drop or rise in website performance.
Cheers!
Hello Rand,
Great checklist to watch out for in 2018. Yes, I would add site speed; also, bad links pointing to a page and low-quality content can really devalue your page's authority and thereby decrease its overall visibility, leading to very bad rankings in search engines. Targeting keywords your website is actually capable of ranking for is another important aspect that really affects your site's ranking and search engine visibility.
Thank you Rand, and expecting another awesome post.
Hi Rand, thank you for the insightful and didactic video.
This post is really helpful, thank you.
Hey Rand,
I always prefer to have a look at the whiteboard image first and then go directly to the points I don't know.
Love the way you present :)
That's great, but for most of our clients, taking all these steps is impossible due to the amount of time it implies. We can try it on our own projects.
I checked for the above issues. My website content is indexed, no duplicates... everything is fine. But I don't know why it's not ranking at the top (it ranked at the top until last month). What else can I do?
Thanks for sharing Rand, you are always producing great resources. And some awesome comments!
One discussion point to add on Step 2, where you mention brand issues.
I have used a process similar to the one in Step 1 to dive into diagnosing brand problems for some of my clients, when we started to see rankings and non-brand keyword search volume stay stable but large drops in traffic.
The section of the website to study is the homepage. Generally, the homepage ranks for most of your branded terms. Studying the performance of that page and comparing it to performance from Google Trends, other tools, and even paid branded campaigns has helped build out some strong insights on brand performance.
Search Console has proven to be an invaluable diagnostic tool in the last few years. Agreed?
This was a super helpful post, as always. I have long-term, high-ranking clients on whom I tend to spend more time on blogging and social media. This is an important exercise to identify areas that may be struggling without you realizing it. Better to be proactive than to wait till a client says, "How come I'm not on page 1 for such-and-such anymore?"
Hi Rand, such a great WBF!
Over the years I've had to learn to always monitor our main 50-60 competitors and assess and improve our pages accordingly. In fast-changing industries like ours, it is imperative to do this, otherwise we would be losing our rankings very fast. We already follow quite a few of the steps you presented, but I've still learned a lot, so thank you!
I'm keeping this post in case I ever need it, although I hope I won't have to resort to it!
Thanks for another really helpful article Rand! Over the last few weeks I encountered a loss in rankings with my website. Will most certainly go through your mentioned possibilities of why a site does not rank well anymore and hope to find the problem!
Good post, Rand! As you say, there are tons of factors that can affect your SERP performance, but you have given us really valuable insight into the most important issues that can ruin our rankings.
Thanks so much for the info, and please keep helping us to grow ;-)
Hi Rand, thank you for the article. However, can the same information apply to product-specific ecommerce sites, or to sites which already have fairly good rankings but whose traffic remains the same?
This is very helpful and I agree with all points, but I'm curious how this translates into the ecommerce world with hundreds to tens of thousands of "thin" product pages.
It's pretty common for clients to come to us with a particular product category that ranks on the 1st/2nd pages, but a slew of other categories that don't even break the top 100.
While we've seen reliably solid sitewide improvements after developing content marketing strategies & implementing on-site fixes, part of me wonders if Google has recently imposed some sort of algorithm changes on ecommerce sites, limiting how many categories small- to medium-sized ones appear for. Is that suspicion totally unfounded or might there be something there?
That's definitely going to help diagnose why a site or its pages are not ranking well. We're trying to rank our site in an international market, and we know it will take time for people to learn more about us and start trusting our brand. That recognition gives you an advantage over others and also helps CTR when a known site ranks against a new one.
Content quality matters even more now, as you should produce content that helps users accomplish their intent or the reason they visited your pages. User experience should be on par with your competitors', or, if possible, provide better tools and features to make the user's task even easier.
There are always going to be bottlenecks and hurdles like product changes, dev requirements, getting approvals, and explaining to every stakeholder why you want to do it and how it will help. But I guess if you're not trying, you never know what you're missing.
Randal with the knowledge most can't handle... sorry, had to, but for real, this was an amazing post. I think mostly due to the timing, when a lot of us are scratching our heads about why we lost traffic or aren't ranking in general. You recommended many awesome fixes and also analyses of what to do when faced with these problems. I also find that checking webmaster tools is my personal first stop in "Ranking No More Land," and from there I'll use a crawler to check the site. One area I've lacked in, though, is diagnostic queries, which make so much sense for seeing where you stand in search.
The last tip, of sometimes just totally replacing and redirecting, is genius if you ask me, because there are times when on-page fixes will not do the trick, so switching it up 100% in terms of URL, etc. is the way to go.
Thanks as always for these great Friday tips and tools of the trade.
Hi, I'm Iranian and I can't use the Moz Pro on-page report card.
My country is under sanctions, so I have a lot of problems with this issue.
Please help me.
Man, this blog post is amazing.
I had a drop recently and I will look at it with your suggestions in mind.
Thanks!
What would you say would be the cause of a website that is indexed and optimized (to a point) but doesn't get ANY traffic, or isn't ranked for any of the keywords being targeted? Diagnostic tools rank the website decently; work has to be done, but it's in good condition overall and still gets NO traffic at all.
Thanks Rand. Another super useful WBF. IMO this is the best one I've recently watched.
Does anyone have a tip for a video or tutorial(s) where I could go deeper on this? For both individual pages and sets of pages. Thx
Hey Rand, great post again! I haven't missed a WBF all year, almost the equivalent of a bachelor's degree ;) !
Anyhow, I've just learned that you're coming to Quebec City for a conference in April. We're lucky to have you coming to our town; I won't miss it for sure!
Also, it might look simple, but how can we see WHEN Google last indexed a page?
Thanks again for the post.
Perfect piece of information for analyzing website performance.
I guess content and metas are the reason for ranking drops in 90% of cases, and spammy backlinks are the reason in the remaining 10%.
Most people don't fully optimize their content for the keywords being used in the metas.
Hi guys, I hope someone can offer some advice on optimizing a meta title and description:
I have a travel website that sells only customized private Morocco tours, with good metrics, but it seems I am missing something else, as the page is not ranking on page 1 and I am stuck on page 2. I believe it has to do with my title and description.
Great article, Rand. I asked a couple of questions but it seems no one is answering here.
Hi Asmoun, thanks for commenting on the blog! Detailed questions can get a bit lost in the comments section, and they don't get much visibility. I'd actually recommend heading to the Moz Q&A (https://moz.com/community/q) and asking your question there; the Q&A is devoted to crowdsourcing solutions to questions, and should be the perfect place for you to find some answers. :)
Hope that helps, take care for now!
Hey Rand, this is a great checklist for assessing page and post performance.
You know, my website is already in the featured snippet for a highly competitive keyword, but I'm not getting the traffic I expected because of Google Ads.
Advertisers are paying for that keyword and taking the top slots above me. This is another factor we can't control: losing traffic even though we rank.