This past March, I was contacted by a prospective client:
My site has been up since 2004. I had good traffic growth up to 2012 (doubling each year to around a million page views a month), then suffered a 40% drop in mid Feb 2012. I've been working on everything that I can think of since, but the traffic has never recovered.
Since my primary business is performing strategic site audits, this is something I hear often. Site appears to be doing quite well, then gets slammed. Site owner struggles for years to fix it, but repeatedly comes up empty.
It can be devastating when that happens. And now more than ever, site owners need real solutions to serious problems.
As this chart shows, when separating out the "expected" roller coaster effect, Google organic traffic took a nose-dive in early February of 2012.
First step: check and correlate with known updates
When this happens, the first thing I do is jump to Moz's Google Algorithm Change History charts to see if I can pinpoint a known Google update that correlates to a drop.
Except in this case, there was no direct "same day" update listed.
A week before, there's an entry that references Panda being integrated further into the main index; however, the discussion around that update suggests the change actually happened sometime in January. So maybe it's Panda, maybe it's not.
Expand your timeline: look for other hits
At this point, I expanded the timeline view to see if I could spot other specific drops and possibly associate them to known updates. I did this because some sites that get hit once, get hit again and again.
Well now we have a complete mess.
At this point, if you're up for the challenge, you can take the time to carefully review all the ups and downs manually, comparing drops to Moz's change history.
Personally, when I see something this ugly, I prefer to use the Panguin Tool. It allows you to see this timeline with a "known update" overlay for various Google updates. Saves a lot of time. So that's what I did.
Well okay this is an ugly mess as well. If you think you can pin enough of the drops on specific factors, that's great.
What I like about the Panguin Tool is you can "turn off" or "hide" different update types to try and look for a consistent issue type. Alternately, you can zoom in to look at individual updates and see if they align with a specific, clear drop in traffic.
Looking at this chart, it's pretty clear the site saw a second dropoff beginning with Panda 3.3. The next dropoff that aligns with a known update appears to be Panda 3.4; however, the site was already in a slide after Panda 3.3, so we can't be certain of that one.
Multiple other updates took place after that where there may or may not have been some impact, followed by further cascading downward.
Then, in the midst of THAT dropoff, we see a Penguin update that also MAY or MAY NOT have played into the problem.
The ambiguous reality
This is a great time to bring up the fact that one of the biggest challenges we face in dealing with SEO is the ambiguous nature of what takes place. We can't always, with true certainty, know whether a site has been hit by any single algorithm change.
In between all the known updates, Google is constantly making adjustments.
The other factor here is that when a given update takes place, it doesn't always roll out instantly, nor is every site reprocessed against that latest change right away.
The cascading impact effect
Here's where evaluating things becomes even more of a mess.
When an algorithm update takes place, it may be days or even weeks before a site sees the impact of that, if at all. And once it does, whatever change in a site's overall status comes from that single algorithm shift, other algorithms are sometimes going to base their own formulaic decisions on that new status of the site.
So if a site becomes weaker due to a given algorithm change, even if the drop is minimal or not directly observable, it can still suffer further losses due to that weakened state.
I refer to this as the "cascading impact" effect.
The right solution to cascading impact losses
Okay, so let's say you're dealing with a site that appears to have been hit by multiple algorithm updates. Maybe some of them are Panda, maybe others aren't Panda.
The only correct approach in this scenario is to step back and understand that for maximum sustainable improvement, you need to consider every aspect of SEO. Heck, even if a site was ONLY hit by Panda, or Penguin, or the "Above the Fold" algorithm, I always approach my audits with this mindset. It's the only way to ensure that a site becomes more resilient to future updates of any type.
And when you approach it this way, because you're looking at the "across-the-board" considerations, you're much more likely to address the actual issues that you can associate with any single algorithm.
The QUART mindset
It was at this point where I began to do my work.
A couple years ago, I coined the acronym QUART—what I call the five super-signals of SEO:
- Quality
- Uniqueness
- Authority
- Relevance
- Trust
With every single factor across the full spectrum of signals in SEO, I apply the QUART* test. Any single signal needs to score high in at least three of the five super-signals.
Whether it's a speed issue, a crawl efficiency issue, topical focus, supporting signals on-site or off-site, whatever it is, if that signal does not score well with quality, uniqueness or relevance, it leaves that page, that section of a site, or that site as a whole vulnerable to algorithmic hits.
If you get those three strong enough, that signal will, over time, earn authority and trust score value as well.
If you are really strong with relevance with any single page or section of the site, but weak in quality or uniqueness, you can still do well in SEO if the overall site is over-the-top with authority and trust.
*When I first came up with this acronym, I had the sequence of letters as QURTA, since quality, uniqueness, and relevance are, in my opinion, the true ideal target above all else. New sites don't have authority or trust, yet they can be perfectly good, valuable sites if they hit those three. Except Jen Lopez suggested that if I shift the letters for the acronym, it would make it a much easier concept for people to remember. Thanks Jen!
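For anyone who likes to see that "at least three of the five" rule written down, here's a toy sketch of the QUART test expressed in Python. The scores are hypothetical judgment calls of my own, not output from any tool; this is purely an illustration of the mindset.

```python
# A toy sketch of the QUART test: a signal should score high on at
# least three of the five super-signals to be considered resilient.
# The scores themselves are hypothetical judgment calls, not tool output.

QUART = ("quality", "uniqueness", "authority", "relevance", "trust")

def passes_quart(scores, threshold=7, required=3):
    """scores: dict mapping each super-signal to a 0-10 judgment call."""
    strong = [s for s in QUART if scores.get(s, 0) >= threshold]
    return len(strong) >= required, strong

ok, strong = passes_quart(
    {"quality": 8, "uniqueness": 9, "authority": 3, "relevance": 8, "trust": 4}
)
print(ok, strong)  # True ['quality', 'uniqueness', 'relevance']
```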
A frustratingly true example
Let's say you have one single page of content, and it's only "okay" or maybe even "dismal" in regard to quality and uniqueness. If the site's overall authority and trust are strong enough, that one page can still outrank an entire site devoted to that specific topic.
This happens all the time with sites like Wikipedia, or Yahoo Answers.
Don't you hate that? Yeah, I know—Yahoo Answers? Trust? Ha!
Sadly, some sites have, over time, built so much visibility, brand recognition, and trust for enough of their content, that they can seemingly get away with SEO murder.
It's frustrating to see. Yet the foundational concept as to WHY that happens is understandable if you apply the QUART test.
Spot mobile site issues
One challenge this site has is that there's also a separate mobile subdomain. Looking at the Google traffic for that shows similar problems, beginning back in February of 2012.
Note that for the most part, the mobile site suffered from that same major initial hit and subsequent downslide. The one big exception was a technical issue unique to the mobile site at the end of 2012 / beginning of 2013.
Identify and address priority issues
Understanding the QUART concept, and having done this work for years, I dove head-first into the audit.
Page processing and crawl efficiency
NOTE: This is an educational site – so all "educational page" labels refer to different primary pages on the site.
For my audits, I rely upon Google Analytics Page Timings data, URIValet.com 1.5 mbps data, and also WebPageTest.org (testing from different server locations and at different speeds including DSL, Cable and Mobile).
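Alongside those tools, a rough first-pass check is easy to script yourself. Here's a minimal sketch (assuming Python, the requests library, and hypothetical placeholder URLs); it only measures HTML delivery, not full rendering or connection throttling, so treat it as a sanity check rather than a replacement for the tools above.

```python
import time
import requests

# Rough server-response and download timing for a few sample pages.
# This only measures HTML delivery, not full page rendering, so treat
# it as a quick sanity check rather than real page-timing data.
PAGES = [
    "https://www.example.com/",                      # hypothetical URLs
    "https://www.example.com/educational-page-1/",
]

for url in PAGES:
    start = time.time()
    resp = requests.get(url, timeout=30, stream=True)
    ttfb = resp.elapsed.total_seconds()   # approx. time to first byte (headers received)
    body = resp.content                   # force the full HTML download
    total = time.time() - start
    print(f"{url}  status={resp.status_code}  "
          f"ttfb={ttfb:.2f}s  total={total:.2f}s  bytes={len(body)}")
```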
Speed improvement goals
Whenever I present audit findings to a client, I explain "Here's the ideal goal for this issue, yet I don't expect you to hit the ideal goal, only that you do your best to make improvements without becoming bogged down in this one issue."
For this site, since not every single page had crisis speed problems, I was looking to have the site owner at least get to a point of better, more consistent stability. So while there's still room for vast improvement, the work performed went quite far in the right direction.
Speed issues addressed: domain and process calls
The first issue tackled was the fact that at the template level, the various pages on the site were calling several different processes across several different domains.
A great resource I rely upon for generating a list of the third-party processes an individual page uses is a report in the WebPageTest.org results. It lists every domain called by the tested page, along with the total number of requests to each, and gives separate data on the total file sizes transferred from each.
Reducing the number of times a page has to call a third-party domain, and the number of times an individual process needs to be run, is often a way to help speed up functionality.
In the case of this site, several processes were eliminated:
- Google APIs
- Google Themes
- Google User Content
- Clicktale
- Gstatic
- RackCDN
Eliminating functionality that was dependent upon third-party servers meant fewer DNS lookups and less dependence upon connections to other servers somewhere else on the web.
Typical service drains can often come from ad blocks (serving too many ads from too many different ad networks is a frequent speed drain culprit), social sharing widgets, third party font generation, and countless other shiny object services.
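If you'd rather quantify this yourself, WebPageTest (and most browser dev tools) can export a HAR file, which is easy to summarize. A minimal sketch, assuming Python and a HAR export saved under the hypothetical filename page.har:

```python
import json
from collections import defaultdict
from urllib.parse import urlparse

# Summarize requests and transferred bytes per domain from a HAR export
# (e.g. saved from WebPageTest or browser dev tools) to spot third-party
# domains worth trimming. "page.har" is a hypothetical filename.
with open("page.har", encoding="utf-8") as f:
    har = json.load(f)

requests_per_domain = defaultdict(int)
bytes_per_domain = defaultdict(int)

for entry in har["log"]["entries"]:
    host = urlparse(entry["request"]["url"]).netloc
    requests_per_domain[host] += 1
    bytes_per_domain[host] += max(entry["response"].get("bodySize", 0), 0)

for host in sorted(requests_per_domain, key=requests_per_domain.get, reverse=True):
    print(f"{host:40s} {requests_per_domain[host]:3d} requests  "
          f"{bytes_per_domain[host] / 1024:8.1f} KB")
```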
Clean code
Yeah, I know—you don't have to have 100% validated code for SEO. Except what I've found through years of this work is that the more errors you have in your markup, the more potential there is for processing delays, and beyond that, the more likely search algorithms are to become confused.
And even if you can't prove in a given site that cleaner code is a significant speed improvement point, it's still a best practice, which is what I live for. So it's always included in my audit evaluation process.
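For what it's worth, markup checking is also easy to fold into a repeatable script. A minimal sketch, assuming Python, the requests library, and the W3C Nu HTML Checker's JSON output (for regular use you'd self-host the checker rather than lean on the public w3.org instance):

```python
import requests

# Send a page's HTML to the W3C Nu HTML Checker and count markup errors.
# Assumes the public validator instance and its JSON output; the URL
# being checked is a hypothetical placeholder.
url = "https://www.example.com/"
html = requests.get(url, timeout=30).text

result = requests.post(
    "https://validator.w3.org/nu/",
    params={"out": "json"},
    data=html.encode("utf-8"),
    headers={"Content-Type": "text/html; charset=utf-8"},
    timeout=60,
).json()

errors = [m for m in result.get("messages", []) if m.get("type") == "error"]
print(f"{url}: {len(errors)} markup errors")
for msg in errors[:10]:
    print(" -", msg.get("message"))
```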
Improve process efficiency
Next up on the list was the range of issues all too many sites have these days regarding efficiency within a site's own content. Tools to help here include Google Webmaster Tools, Google Page Speed Insights, and again WebPageTest.org among others.
Issues I'm talking about here include above-the-fold render-blocking JavaScript and CSS, lack of browser caching, lack of compression of static content, server response times, a host of code-bloat considerations, too-big image sizes, and the list goes on...
NOTE: This is an educational site – so all "educational page" labels refer to different primary pages on the site.
Note: Google Page Speed Insights recommendations and WebPageTest.org's grade reports only offer partial insight. What they do offer, however, can take you a long way toward making speed improvements.
Also, other speed reporting tools abound, with differing degrees of value, accuracy and helpfulness. The most important factor to me is to not rely on any single resource, and to do your own extensive testing. Ultimately, enough research and testing needs to be performed, with follow-up checking, to ensure you address the real issues on a big enough scale to make a difference. Just glossing over things or only hitting the most obvious problems is not always going to get you real, long-term, sustainable results...
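As one example of that follow-up checking, a quick header check over a handful of representative template URLs can confirm whether compression and browser caching are actually being served. A minimal sketch, assuming Python and the requests library, with hypothetical URLs:

```python
import requests

# Spot-check a few representative template URLs for compression and
# browser-caching headers. URLs here are hypothetical placeholders.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/educational-page-1/",
    "https://www.example.com/assets/site.css",
]

for url in PAGES:
    resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"}, timeout=30)
    encoding = resp.headers.get("Content-Encoding", "none")
    cache = resp.headers.get("Cache-Control", resp.headers.get("Expires", "none"))
    print(f"{url}\n  compression: {encoding}\n  caching: {cache}\n"
          f"  server response: {resp.elapsed.total_seconds():.2f}s")
```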
Correct crawl inefficiency
Another common problem I find is that as a site evolves over time, many of its URLs change. When this happens, site owners don't properly clean up their own internal links to those pages. The end result is weakened crawl efficiency, and then weakened user experience quality and trust signals.
Remember that Google and Bing are, in fact, users of your site, whether you want to admit it or not. So if they're crawling the site and run into too many internal redirects (or, heaven forbid, redirect loops) or dead ends, their systems become wary of bothering to continue the crawl. And an abandoned crawl is not helpful by any stretch of the imagination.
It also confuses algorithms.
To that end, I like to crawl a sampling of a site's total internal links using Screaming Frog. That tool gives me many different insights, only one of which happens to be internal link problems. Yet it's invaluable to know. And if a big enough percentage of the URLs in that sample crawl are redirecting or dead ends, that needs to get fixed.
Note: for reference's sake, the total number of pages on the entire site is less than 5,000. So that's a lot of internal inefficiency for a site that size...
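Screaming Frog does this far more thoroughly, but the core check is simple to sketch. Assuming Python, the requests library, and a text file of sampled internal link URLs (one per line, hypothetical filename):

```python
import requests

# Check a sample of internal link targets for redirects, redirect loops,
# and dead ends. "internal_links.txt" is a hypothetical export, one URL per line.
with open("internal_links.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

redirects, dead_ends = [], []
for url in urls:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=30)
    except requests.TooManyRedirects:
        dead_ends.append((url, "redirect loop"))
        continue
    if resp.history:                      # one or more 301/302 hops
        redirects.append((url, resp.url, len(resp.history)))
    if resp.status_code >= 400:           # 404s and other dead ends
        dead_ends.append((url, resp.status_code))

print(f"{len(redirects)} of {len(urls)} sampled links redirect")
print(f"{len(dead_ends)} of {len(urls)} sampled links are dead ends or loops")
```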
Don't ignore external links
While having link redirects and dead ends pointing to outside sources isn't ideal, it's less harmful most of the time than internal redirects and dead ends. Except when it's not.
In this case, the site had previously been under a different domain name prior to a rebranding effort. And the migration resulted in some ugly redirect loops involving the old domain!
Topical focus evaluation
At this point, the audit moved from the truly technical issues to the truly content related issues. Of course, since it's algorithms that do the work to "figure it all out," even content issues are "technical" in nature. Yet that's a completely different rant. So let's just move on to the list of issues identified that we can associate with content evaluation.
Improve H1 and H2 headline tags
Yeah, I know—some of you think these are irrelevant. They're really not. They are one more contributing factor when search engines look to multiple signals for understanding the unique topic of a given page.
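A quick way to audit them at scale is to pull the headings out of a sample of pages. A minimal sketch, assuming Python with requests and BeautifulSoup, and hypothetical URLs:

```python
import requests
from bs4 import BeautifulSoup

# Pull H1/H2 headings from a sample of pages to spot missing, duplicated,
# or off-topic headlines. URLs are hypothetical placeholders.
PAGES = [
    "https://www.example.com/educational-page-1/",
    "https://www.example.com/educational-page-2/",
]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    h2s = [h.get_text(strip=True) for h in soup.find_all("h2")]
    print(url)
    print("  H1:", h1s or "MISSING")
    print("  H2:", h2s or "MISSING")
```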
Noindex large volume of "thin" content pages
Typical scenario here: a lot of pages have very little to no actual "unique" content—at least not enough crawlable content to justify their earning high rankings for their unique focus. Be aware—this doesn't just involve the content within the main "content" area of a page. If that area is surrounded (as was the case on this site) by blocks of content common to other pages, if the main navigation or footer navigation is bloated with too many links (and surrounding words in the code), or if you offer too many shiny-object widgets (as this site did), that "unique" content evaluation is going to become strained (as it did for this site).
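One rough way to flag candidate pages is to compare the text inside the main content area to everything else in the template. A minimal sketch, assuming Python with requests and BeautifulSoup; the CSS selector for the content area is a hypothetical placeholder you'd adjust to the actual template:

```python
import requests
from bs4 import BeautifulSoup

# Rough thin-content heuristic: how much of the page's text actually lives
# in the main content area versus template boilerplate? The selector
# "div.main-content" is a hypothetical placeholder for the real template.
url = "https://www.example.com/educational-page-1/"
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

page_words = len(soup.get_text(" ", strip=True).split())
main = soup.select_one("div.main-content")
main_words = len(main.get_text(" ", strip=True).split()) if main else 0

ratio = (main_words / page_words * 100) if page_words else 0
print(f"{url}: {main_words} words in main content of {page_words} on the page ({ratio:.0f}%)")
```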
Add robust crawlable relevant content to video pages
You can have the greatest videos on the planet on your site. And yet, if you're not a CNN, or some other truly well-established, high-authority site, you are almost always going to need to add high-quality, truly relevant content to the pages that host those videos. So that was done here.
And I'm not just talking about "filler" content. In this case (as it always should be) the new content was well written and supportive of the content in the videos.
Eliminate "shiny object" "generic" content that was causing duplicate content / topical dilution confusion across the site
For pages that had thin content but were worth salvaging, I never recommend throwing them out. Instead, take the time to add more valuable content, yes. But also consider eliminating some of those shiny objects. For this site, reducing them vastly improved the uniqueness of those pages.
Improve hierarchical URL content funnels reducing the "flat" nature of content
Flat architecture is an SEO myth. Want to know how I know this? I read it on the Internet, that's how!
Oh wait. That was ME who said it.
Seriously, though. If all your content looks like this:
www.domain.com/category
www.domain.com/subcategory
www.domain.com/productdetailpage
That's flat architecture.
It claims "every one of these pages is as important as every other page on my site.
And that's a fantasy.
It also severely harms your ability to communicate "here's all the content specific to this category, or this sub-category" (something a hierarchical URL like www.domain.com/category/subcategory/productdetailpage signals at a glance). And THAT harms your ability to say "hey, this site is robust with content about this broad topic".
So please. Stop with the flat architecture.
And no, this is NOT just for search engines. Users who see proper URL funnels can rapidly get a cue as to where they are on the site (or, when they see those URLs in the search results, more confidence about trust factors).
So for this site, reorganization of content was called for and implemented.
Add site-wide breadcrumb navigation
Yes—breadcrumbs are helpful, because they reinforce topical focus, content organization, and user experience.
So these were added.
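If you want to reinforce breadcrumbs for search engines as well as users, schema.org BreadcrumbList markup is one option. A minimal sketch, assuming Python, that builds the JSON-LD from a hierarchical trail (the names and URLs below are hypothetical placeholders):

```python
import json

# Build schema.org BreadcrumbList JSON-LD for a hierarchical page.
# The trail below is a hypothetical example; real labels and URLs would
# come from the site's own category structure.
trail = [
    ("Home", "https://www.example.com/"),
    ("Category", "https://www.example.com/category/"),
    ("Subcategory", "https://www.example.com/category/subcategory/"),
    ("Product Detail", "https://www.example.com/category/subcategory/product/"),
]

breadcrumb = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

print('<script type="application/ld+json">')
print(json.dumps(breadcrumb, indent=2))
print("</script>")
```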
Noindex,nofollowed over 1,300 "orphaned" pages
Pop-up windows. They're great for sharing additional information with site visitors. Except when you allow those to become indexable by search engines. Then all of a sudden, you've got countless random pages that, on their own, have no meaning, no usability, and no way to communicate "this is how this page relates to all these other pages over there". They're an SEO signal killer. So we lopped them out of the mix with a machete.
Sometimes you may want to keep them indexable. If you do, they need full site navigation and branding, and proper URL hierarchical designations. So pay attention to whether it's worth it to do that or not.
Remove UX confusing widget functionality
One particular widget on the site was confusing from a UX perspective. This particular issue had as much to do with site trust and overall usability as anything, and less to do with pure SEO. Except it caused some speed delays, needless site-jumping, repetition of effort and a serious weakening of brand trust. And those definitely impact SEO, so it was eliminated.
Noindex internal site "search" results pages
Duplicate content. Eliminated. 'nuff said?
Eliminate multiple category assignments for blog articles
More duplicate content issues. Sometimes you can keep these, however if multiple category assignments get out of hand, it really IS a duplicate content problem. So in this case, we resolved that.
Unify brand identity across pages from old branding that had been migrated
Old brand, new brand—both were intermingled after the site migration I previously described. Some of it was a direct SEO issue (old brand name in many page titles, in various on-site links and content) and some was purely a UX trust factor.
Unify main navigation across pages from old branded version that had been migrated
Again, this was a migration issue gone wrong. Half the site had consistent top navigation based on the new design, and half had imported the old main navigation. An ugly UX, crawl and topical understanding nightmare.
Add missing meta descriptions
Some of the bizarre auto-generated meta descriptions Google had been presenting on various searches were downright ugly. Killed that click-block dead by adding meta descriptions to over 1,000 pages.
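Finding those gaps is straightforward to script. A minimal sketch, assuming Python with requests and BeautifulSoup, and a hypothetical list of URLs to check:

```python
import requests
from bs4 import BeautifulSoup

# Flag pages that have no meta description (or an empty one).
# "all_pages.txt" is a hypothetical list of URLs, one per line.
with open("all_pages.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

missing = []
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content", "").strip():
        missing.append(url)

print(f"{len(missing)} of {len(urls)} pages are missing a meta description")
for url in missing[:20]:
    print(" -", url)
```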
Remove extremely over-optimized meta keywords tag
Not a problem you say? Ask Duane Forrester. He'll confirm—it's one of many signal points they use to seek out potential over-optimization. So why risk leaving them there?
About inbound links
While I found some toxic inbound links in the profile, there weren't many on this site. Most of those actually disappeared on their own thanks to all the other wars that continue to rage in the Penguin arena. So for this site, no major effort has yet gone into cleaning up the small number that remain.
Results
Okay so what did all of this do in regard to the recovery I mention in the title? You tell me.
And here's just a small fraction of the top phrase ranking changes:
Next steps
While the above charts show quite serious improvements since the implementation was started, there's more work that remains.
Google Ad Scripts continue to be a big problem. Errors at the code level and processing delays abound. It's an ongoing issue many site owners struggle with. Heck—just eliminating Google's own ad server tracking code has given some of my clients as much as one to three seconds of overall page-processing improvement, depending on the number of ad blocks as well as intermittent problems on Google's ad server network.
Except at a certain point, ads are the life-blood of site owners. So that's a pain-point we may not be able to resolve.
Other third-party processes come with similar problems. Sometimes third-party "solution" providers are willing to improve their offerings; however, the typical answer to "your widget is killing my site" is "blah blah blah not our fault blah blah blah" when I know for a fact, from countless tests, that it is.
So in this case, the client is doing what they can elsewhere for now. And ultimately, if need be, will abandon at least some of those third parties entirely if they can get a quality replacement.
And content improvements—there's always more to do on that issue.
Bottom line
This is just one site, in one niche market. The work has been and continues to be extensive. It is, however, quite typical of many sites that suffer from a range of issues, not all of which can be pinned to Panda. Yet ignoring issues you THINK might not be Panda-specific is a dangerous game, especially now in 2015, when it's only going to get uglier out there...
So do the right thing for the site owner / your employer / your client / your affiliate site revenue...
Thanks for a really comprehensive round-up of your audit process and for the tip-off about the Panguin tool. I hadn't heard of this before but it's going to be very useful in my own audits from now on.
I started working on a project with similar problems about two years ago (a car insurance website) - they'd had a few great years, despite sailing very close to the wind with their SEO tactics, then things took a sudden nose-dive and things slid gradually downhill to a virtual standstill from there. My own audit, unsurprisingly, diagnosed amongst other things a strong likelihood of having been walloped by algorithm updates.
Anyway, to cut a long story short, we managed to turn things around, but it took a full year of work before we saw any significant upturn. I guess the point I'm making is we were fortunate this customer understood what was involved and had realistic expectations of what we could achieve and the likely timescale, given the rather limited budget we had to work with, and was prepared to invest in a long-term solution. But that's a whole other blog post!
Mandy,
Glad you like the post.
Like all tools, the Panguin tool has its value. And can be a time saver if things do line up.
You bring up a very important point - even after someone who can do the evaluation / audit work is involved, for many sites, even with client buy-in, there may not be the resources to apply enough effort consistently. Sometimes that just means more time is required (like the site you are talking about).
Sadly however, sometimes it's not enough to just keep plugging away on a small scale depending on the site, competitive landscape and limitation of resources.
In those situations, I sometimes end up needing to help the client understand it might be a losing battle. Painful to say, yet needed in those situations.
This is awesome! It's rare that you can pinpoint one definite cause of such a drop. It's best to invest time and resources to ensure that your site is "good" on all fronts. Thanks for the examples.
This was fantastic Alan. Very helpful. And it's exciting to see the traffic increase for this site.
I would love more discussion on the "flat" architecture debate. It seems that the standard, when someone is asking about flat architecture is to point them to Rand's WBF : https://moz.com/blog/whiteboard-friday-flat-site-architecture. But, I feel that what Rand is describing there is not necessarily what some people call "flat" today. He is still advocating having directories and categories, but simply reducing the number of clicks it takes to get to any particular page. I think that a lot of people think that "flat" means that you simply remove any subdirectories and have all flat urls. Or maybe I'm just confused?
As with (seemingly) everything else in SEO, it seems people take a blog post or video and run with it to the extreme. While, in general, flat architecture is better for efficient crawling, there's still a place for three, four, etc. "levels" when it's part of a sales funnel, goal funnel, or something else.
I'm guessing (not speaking for him, :-D) that's where Alan was going; don't just eliminate your funnels and subfolders just to make everything "flat" for the sake of it being "flat." Another reason for not completely eliminating subfolders and deeper levels would be the semantic relationship between the pages' topics.
For the same reason breadcrumbs are valuable and can help reiterate semantic relationships, I'd advocate there's some topical relationships you could reinforce via folder structure, as well. I.e. /category/service/product-associated.
But I'm certainly not an expert in that realm... :) Bill Slawski? Others? Am I crazy?
Marie,
Brady did a good job explaining my reasoning.
Content organization in URLs needs to model the ideal way it's organized in site navigation (if site navigation is properly set up). In proper site navigation, it IS a funnel. You don't link to 5,000 pages in your main navigation bar, do you? Or wow. You really need to NOT be doing that :-)
Amazing job Alan. I think a lot of the improvement came from that noindex, follow part - I have cases where deindexing about 10k or more subpages resulted in an amazing increase in overall traffic (mostly thin content).
Good Job Alan, good job. ;)
Now this is a great post. It doesn't only tell us what to do after a site has been penalized, but also what to do to avoid penalization.
You've also killed two SEO myths. The first is "flat architecture", and the second one is "on-page SEO is dead". Obviously it's not dead.
Thank you Stelian!
An absolutely awesome job, Alan. I'm embarrassed to say that I had to read it twice to catch up with your thought process, but you nailed it! And thanks for mentioning urivalet... I'd forgotten all about that tool and how handy it is. Keep rockin' the audit world, bud!
Thanks Doc! When Edward popped into Twitter last time (one of his rare brief appearances), he was surprised to learn a number of us still got value from it; he had been about to shut it down. Prevented a disaster!
Hey Alan,
Thanks so much for producing such a detailed and actionable account of all the possibilities. One of the hardest parts of being a support person for people dealing with a specific problem is helping them to understand that they may have a truckload of other problems dragging them down.
My most common advice to support callers is to make sure every other issue on their site has been properly attended to in addition to dealing with resolving unnatural linking issues for manual actions or penguin effects.
It is wonderful to have this as a detailed resource I can now point people to and be confident they are much more likely to grasp the implications of all the other things that can affect visibility.
Love your work :)
Sha
Thanks Sha !
Yeah - educating clients to get them to realize the implications / consequences is so important. We can expend countless hours of our time, experience and energy in producing findings, yet if they don't grasp the seriousness from a bigger picture perspective, it's a waste of everyone's time...
Apologies for a very late response but this is one of my favourite articles on here and I like many others have this permanently bookmarked now.
Ronnie,
The knowledge contained in the post is still valid, so it's not too late :-)
WOW Thanks for all the great comments! Just woke up - jumping on a client call. Will respond when I can!
Thanks Alan for providing such helpful information.
It really was very helpful for me!
Thanks
You're welcome! Glad you found it helpful!
Thank you Alan, I am new to the Moz community and everything that I have seen so far has been great, but I almost missed this article, which is absolutely outstanding!
My takeaway from this post is that a good website audit will include:
Algorithm updates correlation analysis
Page Processing & Crawl efficiency analysis
Backlink analysis
Internal linking analysis
Content analysis
UX, Markup and code lookup and
a QUART mindset
It would be great to see another audit case study of yours about recovering a website by improving UX, specifically through implementing responsive design.
Thanks again!
Raul,
That's essentially the boiled down list of broad bucket considerations of an audit as far as the type I perform goes. Each site is unique, so there are often other considerations.
Take for example, a site that has 200 sub-domains associated with it. Or a site that operates in a field where social engagement is vital to the SEO. So sometimes additional off-site evaluations (off of the main site in question) are also included. And that's just two of many other examples.
Responsive design. Ugh. Can't tell you how many responsive design sites I've audited where everyone claimed "it's just the presentation layer, so it doesn't impact SEO" and yet I found critical flaws in its execution that directly harmed SEO. :-)
Such a helpful article for site audits.
Thanks Alan.
Got an email with your original comment.
Glad to hear you found the post valuable! :-)
This article was such a help. I had not heard of the Panguin tool and it will be a great supplement to the tools I already use. One thing I really loved about the article is how you provided an example of what you tell clients. I find this is something we continually struggle with. Communicating to a client, who is near panic, that it will take time and money to get back on track is always a challenge.
I've been struggling with meta keywords and explaining how their place in SEO has changed over time. Many clients have a limited knowledge of SEO and they get caught up with anything with the term meta. It is a waste of resources to focus on Meta Keywords instead of improving H1 tags.
Thank you for the article and I will be referencing this in the future.
Tory,
Thank you - it sounds like my goals of helping our community do our work have been achieved then!
Nice post with a very useful point of view. I will definitely be using it to view some sites whether they were hit or not. Thanks
Wow thats a nice recovery! Thanks for the very detailed post. Can Panda only penalize sites? Or by optimizing for Panda is it possible to get a boost in search rankings? I've done some work to improve site speed but don't believe I've been impacted by Panda.
Interesting concept. Can a site see a boost if it gets "Panda" signals right?
First we need to understand that Panda is one of many related scoring sets in the greater Google system. If that scoring set determines a site is very weak on quality and trust signals, that site will likely, at some point, see a loss of rankings. That, then, would be considered a Panda "penalty".
Conversely, that same site, if cleaned up properly, should, at some point, see an increase in rankings the next time those Panda specific algorithm factors are evaluated.
So what about a site that hadn't been hit by Panda? Wouldn't it then make sense that such a site should see a boost?
Not really, at least in most situations. For the vast majority of sites, if they're not "penalized" by Panda, it means they are "good enough". (otherwise they would have been hit)
And if a site is good enough to not be penalized, it's not likely to see gains.
There are, however, situations where gains can be had.
First, if the site is in a field where other competitor sites were hit by Panda, that site will naturally rise in rankings, even though, technically, its total ranking score remained the same.
There is at least some anecdotal talk around the web about sites that got an actual boost. So I suppose it is possible that a site could see a boost in its ranking score from a Panda update as an isolated change. However without knowing the full spectrum of a given competitive landscape and whether that boost might have come from other sites falling, it's very difficult to know what caused that boost.
Sure, some data has come out, especially with the bigger Panda rollouts where researchers have shown "winners" and "losers". So it is possible.
I just think however, that for the overwhelming majority of sites out there, that they fit in the "just good enough to have not been penalized" column.
I say that because every site I've ever audited, even ones that have not been hit by Panda or any other major algorithmic penalty, has had at least some flaws in its overall SEO, and that translates into at least some weakness / vulnerability.
Just my opinion though. Others may actually have done extensive studies on this concept where they were able to isolate sites in ways that could show their gains weren't triggered by competitor losses.
Thanks for the information, but I've got one question. My website has CSS validation errors; will that hurt it in any way?
There is no requirement for a site to pass with 100% validation regarding CSS or HTML. However the more errors you have in either of those on any given page / across the entire site, the more likely search engines may become confused in their understanding of the site's presentation, message and topical focus. This can harm user experience, and search engine quality and trust signals.
The examples explain the SEO audit process very well. Thank you for your great article.
Wow! amazing post, direct to my favorites! many thanks for sharing your knowledge, I have detected important mistakes that we need to correct to improve our SEO.
Well this is an EPIC post. And there's a lot to learn from it. I actually never took a lot of these things into consideration, but now I've learned that each and every thing counts.
Definitely going to test this out ... even on a site that has just had some rankings drop but no penalty.
Yes, it's a good practice to do a full evaluation at least once in a while, regardless of whether there has been a clear penalty or not.
Thank you very much :)
You rocked this one, Alan! I especially liked "shiny object services" :-)
This is a really great example of how no one single magic trick leads to success and Lamborghinis for everyone...and that addressing all the little things that all add up really makes a difference.
Thanks also for the reference to Panguin tool. Somehow I hadn't run across that yet. Very useful.
So. Where do I go to pick up my Lamborghini? And what color is it? I really like the lime green ones I've seen (though the one I drove when I was in Vegas 2 years ago was black...)
Is the panguin tool still a nice tool to monitor penalties? I did not find it useful, but maybe it was my mistake.. thanks for your great post full of information! the thing I don't like about panguin tool is you have to grant it access to your data..
Eugenio
I find the Panguin tool is helpful when looking at the historic data. However, like anything else in SEO, it is best to not rely entirely on one tool, resource or signal. Having a broad evaluation allows for a more complete understanding.
Hello Alan,
Your article has covered almost everything for recovering from a penalty, but our team at SEO Hunk International recently worked on removing a penalty and succeeded. In our process we used link research tools for link detection. We have been listed in their case studies “3 Great Google Penalty Recovery Stories”. https://www.seohunkinternational.com/blog/manual-penalty-revoked-a-case-study/
Wow Alan, great case study. This is the kind of post that makes you audit your own web/blog after reading it. I'm gonna review the QUART on my blog right now thanks to you! hehe.
Rubén
This. This is why I do what I do. :-) Also for more info on QUART, you can read my original article about it over on SEJ at https://bit.ly/quartseo
Hello.
I don't know if I'm in the right place, as this is my first comment. I've recently opened an electrical supplies business and have created a website. I didn't know how much went into it, i.e. SEO, content, etc., to get noticed on Google. I would like some pointers or help to get me going.
If I'm commenting in the wrong place, I am sorry.
www.adhelectricalsuppliesltd.co.uk is my website.
The domain name was created in March 2013 and I have done research, but I still can't get my head around everything. I really want to do well on the internet and I know it will take a long time, as I've read online.
Many thanks, Andrew
Our Q&A section at https://moz.com/community/q (available with a free trial of Moz Pro) is a great place to ask questions about your site. Also see our free beginner's guides at https://moz.com/learn/seo to get you started.
A really great read, I love a proper real-world example! Thank you for compiling & sharing this.
The only thing I would add, right at the outset, is to check that you can rely on what you're seeing in Google Analytics (or whatever analytics you have). On more than one occasion in my experience, some "clever" dev has filtered out some bot traffic, or removed some Analytics tags, or done something else that corrupts the data we get from Analytics, which would really mess up any SEO analysis.
Clearly not the case in this example but I think it's worth bearing in mind. Always question the data you are presented with!
I agree Martin! It's a primary reason I use URIValet, WebPageTest, ScreamingFrog, OpenSiteExplorer, and several other great tools.
Nice post Alan,
I personally think that fixing technical SEO problems like crawling & indexing issues, archiving old pages, etc. sometimes (if not always) has a direct effect on a site's overall traffic and ranking.
Hello Sir Alan,
Once again you did a complete justice to the topic. You explained everything in a great manner, I just have some questions and hopefully you'll look into them:
Really looking forward to your response.
Thanks,
Umar
Umar,
You're welcome - every post I write is presented with the hope it will help others.
1. GPSI:
50/100 in GPSI. That's tragically low. And it's why I double-check and triple-check those findings with URIValet.com 1.5 mbps speed data on a sampling of page template types, as it most often (not always) represents a middle-ground speed that Google sees across the volume of actual site users. And then go to WebPageTest.org - they have a grading system of their own, and you can test speeds there.
It's been shown through many case studies that improving speed and crawl efficiency, even if it doesn't improve SEO dramatically, can still improve conversion rates dramatically.
2. Changing Hosts
Changing hosts should not cause problems if that's all that's done. If the new server system is worse than the old one, then yes, that can cause problems. Or if the URL structure changes, that too can.
How much time it takes if there's a loss depends on the extent of the new problems. I've seen sites drop with a move and, once the new problems were fixed, begin to return to normalcy within a month or two. And I've seen others take several months to recover because the scale of the new problems was big.
In three cases, I saw rebound happen literally overnight once the problem was resolved.
In each case above, the problems were unique.
3. 2015
QUART - my five super-signals. Apply that to every single aspect of work that's done for a site. It's a very easy concept to remember, and with practice, it becomes easier to apply across the board. Google is only going to get more restrictive, so the sooner site owners and managers get on board with quality, uniqueness, authority, relevance and trust, the sooner they'll be properly positioned for sustainability.
Thank you for the great answers, I really appreciate it :)
QUART...I like that! I'll start using that at my agency. Great article and I love the details. Agreed with the above poster on Moz Pro. Worth the money!!!
Thanks - yeah QUART just really drives the most important concepts in a no-nonsense way.
Alan, this is a tremendous guide that I've bookmarked for future reference whenever I need to address any clients' penalty issues. So, thanks!
I just had one question on this:
Eliminate multiple category assignments for blog articles
More duplicate content issues. Sometimes you can keep these, however if multiple category assignments get out of hand, it really IS a duplicate content problem. So in this case, we resolved that.
This might be an "optimize for search engines versus optimize for users" issue. While we all know why duplicate content is bad from a search-engine standpoint, are there not legitimate times to place content in multiple categories from a UX perspective? Like, if I write a post that deals extensively with both, say, social media and public relations, then would not people interested in either topic want to be able to find it in both categories?
On my personal site, for example, I no-index all category archive pages after the first page -- to me, this eliminates duplicate-content issues while maintaining the UX as described and keeping the (indexed) category pages as original as possible.
Of course, I might be missing something. Alan (or anyone!), I'd love to get further thoughts on this when you have time. For example, the Moz content team might have some observations -- since Moz posts can be assigned to multiple categories, they might have some data on why this is (or maybe is not) a good practice.
Samuel,
As is the case with most SEO, "it depends".
if you think "this really does relate to both these categories, yes you can assign to both. Except what if 90% of your posts fit that? That's 90% duplication. And Users to a certain degree WILL go WTF? Why do I keep seeing the same posts over and over again?
THAT is why it's bad for SEO. Search engines do NOT want to annoy / frustrate searchers. So its not JUST for search engines. It's for search engines BECAUSE its for users.
Hi Alan it's amazing indeed!
After you made the site speed changes, did you see immediate improvement in loading time?
Yes the speed improvements came as soon as the work was completed on that group of issues.
One last question - was it a one-time paid project or consulting billed per hour? Just wondering how people manage to work with clients on these kinds of heavy cases.
Łukasz
The initial audit was a one-time paid project. I do, however, offer all my audit clients additional consulting during the implementation phase. If they need me extensively in that consulting beyond a couple hours, it's at an hourly rate. Otherwise I just offer the occasional support at no additional charge, since they pay me very good money for the audit and action plan effort.
Great! Thanks for the answer! And once again - amazing job. I'd love to read more and share knowledge about some heavy cases when it comes to recovery. ;)
Thanks a lot Alan! I really haven't done a full-on audit in a while (for our in-house website). It is great, but also hectic, that whenever I jump on the Moz blog, I always find a million things that I need to be doing better.
Good work on recovering that site. I find that, especially with an in-house website team, one that doesn't care as much about SEO factors, the website gets messed up from a technical standpoint over time. Regular audits are like a deep clean that needs to happen on a schedule to make sure everything is operating at its fullest potential.
Yes, over time, the technical problems can grow exponentially and I've seen that ruin sites too many times!
One of the major concerns with most of the sites that I have analyzed is over-optimization; people tend to over-optimize their websites with the same phrases many times just to gain an advantage over their competitors, which in reality is a big setback for their websites.
Salman,
Yes, I have seen many of those as well. Over optimization is a major problem.
This is a great post. And even after you clean up the site, it is important to do good on-site SEO to increase traffic/sales/visits (etc...) :)
Thanks Alan for this information.
Thanks for sharing the article. It was great going through it and I have learnt many new things.
I am confused about UX. What is that? Please check the "clipping path" keyword and see where www.clippingpathspecialist.com ranks. Please take a look and give me feedback: how can I get to the 1st position for the "clipping path" keyword?
Atiqur
UX is "User Experience". Search engines attempt to emulate user experience when they evaluate whether a page, section or site deserves to rank for a phrase the algorithms assess as most relevant for intent.
Regarding your site: I can't assume from a quick look what might be right or wrong for any single site. There are hundreds of factors.
A quick check shows you are ranking low on the 1st page of Google for "clipping path service". So you have a good foundation of SEO.
Where weakness might be could be any number of issues. Since you only have one page for each service, if your site is competing against sites that have multiple pages for each of your primary service phrases, that's a consideration.
Other issues might be quality related. Your main clipping path page fails Google's Page Speed Insights test for mobile users (55 out of 100 points) and barely passes for desktop users (78 out of 100 points).
Your template is broken though. When I did a test just now on that page with URIValet.com it revealed a "403" server error on almost every single image in your site template.
And every single link within that page resulted in a 403 server error in my test as well.
So something is critically flawed there.
And there could be many other issues as well. That's the first few I found though in a quick check.
You might also want to get a trial of Moz Pro if you haven't before, and check out our Q&A section at https://moz.com/community/q/. You'll be able to ask specific questions there and get feedback from the community.