It’s a story we hear too often: someone hires a bad SEO, that SEO builds a bunch of spammy links and cashes the check, and then bam – penalty! Whether you got bad advice, “your friend” built those links, or you’ve got the guts to admit you did it yourself, undoing the damage isn’t easy. If you’ve sincerely repented, I’d like to offer you 6 ways to recover and hopefully get back on Google’s Nice list in time for the holidays.
This is a diagram of a theoretical situation that I’ll use throughout the post. Here’s a page that has tipped the balance and has too many bad (B) links - of course, each (B) and (G) could represent 100s or 1000s of links, and the 50/50 split is just for the visual:
Be Sure It’s Links
Before you do anything radical (one of these solutions is last-ditch), make sure it’s bad links that got you into trouble. Separating out a link-based penalty from a devaluation, technical issue, Panda “penalty”, etc. isn’t easy. I created a 10 minute audit a while back, but that’s only the tip of the iceberg. In most cases, Google will only devalue bad links, essentially turning down the volume knob on their ability to pass link-juice. Here are some other potential culprits:
- You’ve got severe down-time or latency issues.
- You’re blocking your site (Robots.txt, Meta Robots, etc. – see the example after this list).
- You’ve set up bad canonicals or redirects.
- Your site has massive duplicate content.
- You’ve been hacked or hit with malware.
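A single stray line in Robots.txt, for example, can block your entire site – a hypothetical worst case:

    User-agent: *
    Disallow: /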
Diagnosing these issues is beyond the scope of this post, but just make sure the links are the problem before you start taking a machete to your site. Let’s assume you’ve done your homework, though, and you know you’ve got link problems…
1. Wait It Out
In some cases, you could just wait it out. Let’s say, for example, that someone launched an SQL injection attack on multiple sites, pointing 1000s of spammy links at you. In many cases, those links will be quickly removed by webmasters, and/or Google will spot the problem. If it’s obvious the links aren’t your fault, Google will often resolve it (if not, see #5).
Even if the links are your responsibility (whether you built them or hired someone who did), links tend to devalue over time. If the problem isn’t too severe and if the penalty is algorithmic, a small percentage of bad links falling off the link graph could tip the balance back in your favor:
That’s not to say that old links have no power, but just that low-value links naturally fall off the link-graph over time. For example, if someone builds a ton of spammy blog comment links to your site, those blog posts will eventually be archived and may even drop out of the index. That cuts both ways – if those links are harming you, their ability to harm will fade over time, too.
2. Cut the Links
Unfortunately, you can’t usually afford to wait. So, why not just remove the bad links?
Well, that’s the obvious solution, but there are two major, practical issues:
(a) What if you can’t?
This is the usual problem. In many cases, you won’t have control over the sites in question or won’t have login credentials (because your SEO didn’t give them to you). You could contact the webmasters, but if you’re talking about 100s of bad links, that’s just not practical. The kind of site that’s easy to spam isn’t typically the kind of site that’s going to hand-remove a link, either.
(b) Which links do you cut?
If you thought (a) was annoying, there’s an even bigger problem. What if some of those bad links are actually helping you? Google penalizes links based on patterns, in most cases, and it’s the behavior as a whole that got you into trouble. That doesn’t mean that every spammy link is hurting you. Unfortunately, separating the bad from the merely suspicious is incredibly tough.
For the rest of this post, let’s assume that you’re primarily dealing with (a) – you have a pretty good idea which links are the worst offenders, but you just can’t get access to remove them. Sadly, there’s no way to surgically remove the link from the receiving end (this is actually a bit of an obsession of mine), but you do have a couple of options.
3. Cut the Page
If the links are all (or mostly) targeted at deep, low-value pages, you could pull a disappearing act:
In most cases, you’ll need to remove the page completely (and return a 404). This can neuter the links at the target. In some cases, if the penalty isn’t too severe, you may be able to 301-redirect the page to another, relevant page and shake the bad links loose.
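If you’re on Apache, for example, both options are quick one-line directives – a sketch using mod_alias, with placeholder paths (a 410 “Gone” does the same job as a 404 here):

    # Remove the page completely (returns a 410 Gone)
    Redirect gone /spammed-page.html

    # Or, if the penalty isn't severe, 301 it to a relevant page
    Redirect 301 /spammed-page.html /relevant-page.html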
If all of your bad links are hitting a deep page, count yourself lucky. In most cases, the majority of bad links point at a site's home-page (as the majority of any links do), so the situation gets a bit uglier.
4. Build Good Links
In some sense, this is the active version of #2. Instead of waiting for bad links to fade, build up more good links to tip the balance back in your favor:
By “good”, I mean relevant, high-authority links – if your link profile is borderline, focus on quality over quantity for a while. Rand has a great post on link valuation that I highly recommend - it’s not nearly as simple as we sometimes try to make it.
This approach is for cases where you may be on the border of a penalty or the penalty isn’t very severe. Fair warning: it will take time. If you can’t afford that time, have been hit hard, or suspect a manual penalty, you may have to resort to one of the next two options…
5. Appeal to Google
If you’ve done your best to address the bad links, but either hit a wall or don’t see your rankings improve, you may have to appeal to Google directly. Specifically, this means filing a reconsideration request through Google Webmaster Tools. Rhea at Outspoken had an excellent post recently on how to file for reconsideration, but a couple of key points:
- Be honest, specific and detailed.
- Show that you’ve made an effort.
- Act like you mean it (better yet: mean it).
If Google determines that your situation is relevant for reconsideration (a process which is probably semi-automated), then it’s going to fall into the hands of a Google employee. They have to review 1000s of these requests, so if you rant, provide no details, or don’t do your homework, they’ll toss your request and move on. No matter how wronged you may feel, suck it up and play nice.
6. Find a New Home
If all else fails, and you’ve really burned your home to the ground and salted the earth around it, you may have to move:
Of course, you could just buy a new domain, move the site, and start over, but then you’ll lose all of your inbound links and off-page ranking factors, at least until you can rebuild some of them. The other option is to 301-redirect to a new domain. It’s not risk-free, but in many cases a site-to-site redirect does seem to neuter bad links. Of course, it will very likely also devalue some of your good links.
I’d recommend the 301-redirect if the bad links are old and spammy. In other words, if you engaged in low-value tactics in the past but have moved on, a 301 to a new domain may very well lift the penalty. If you’ve got a ton of paid links or you’ve obviously built an active link farm (that’s still in play), you may find the penalty comes back and all your efforts were pointless.
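In Apache terms, a site-to-site move usually boils down to a blanket rule like this sketch (new-domain.com is hypothetical; assumes mod_rewrite in .htaccess on the old domain):

    # .htaccess on the old domain - send everything to the new one
    RewriteEngine On
    RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]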
A Modest Proposal
I’d like to end this by making a suggestion to Google. Sometimes, people inherit a bad situation (like a former SEO’s black-hat tactics) or are targeted with bad links maliciously. Currently, there is no mechanism to remove a link from the target side. If you point a link at me, I can’t say: “No, I don’t want it.” Search engines understand this and adjust for it to a point, but I really believe that there should be an equivalent of nofollow for the receiving end of a link.
Of course, a link-based attribute is impossible from the receiving end, and a page-based directive (like Meta Robots) is probably impractical. My proposal is to create a new Robots.txt directive called “Disconnect”. I imagine it looking something like this:
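    User-agent: *
    Disconnect: www.badsite.com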
Essentially, this would tell search engines to block any links to the target site coming from “www.badsite.com” and not consider them as part of the link-graph. I’d also recommend a wild-card version to cover all sub-domains:
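    User-agent: *
    Disconnect: *.badsite.com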
Is this computationally possible, given the way Google and Bing process the link-graph? I honestly don’t know. I believe, though, that the Robots.txt level would probably be the easiest to implement and would cover most cases I’ve encountered.
While I recognize that Google and Bing treat bad links with wide latitude and recognize that site owners can’t fully control incoming links, I’ve seen too many cases at this point of people who have been harmed by links they don’t have control over (sometimes, through no fault of their own). If links are going to continue to be the primary currency of ranking (and that is debatable), then I think it’s time the search engines gave us a way to cut links from both ends.
Update (December 15th)
From the comments, I wanted to clarify a couple of things regarding the "Disconnect" directive. First off, this is NOT an existing Robots.txt option. This is just my suggestion (apparently, a few people got the wrong idea). Second, I really did intend this as more of a platform for discussion. I don't believe Google or Bing are likely to support the change.
One common argument in the comments was that adding a "Disconnect" option would allow black-hats to game the system by placing risky links, knowing they could be easily cut. While this is a good point, theoretically, I don't think it's a big practical concern. The reality is that black-hats can already do this. It's easy to create paid links, link farms, etc. that you control, and then cut them if you run into trouble. Some SEO firms have even built up spammy links to get a short-term boost, and then cut them before Google catches on (I think that was part of the JC Penney scheme, actually).
Almost by definition, the "Disconnect" directive (or any similar tool) would be more for people who can't control the links. In some cases, these may be malicious links, but most of the time, it would be links that other people created on their behalf that they no longer have control over.
Most of the time the penalty is not from bad links. It's too easy to create link spam for your competitor, so Google can't really take these situations seriously. The only "bad links" I know of are spammy links from other web properties that Google has determined you are associated with via some form of ownership – if the whois contact matches, the IP is on the same block, it detects a chain of reciprocals, etc.
In many cases, I agree - those are certainly the easiest to spot. Unfortunately, I've seen a handful of cases this year where I'm 98% sure that malicious links were causing serious problems. It's hard to ever be 100% sure (you could say this about most SEO issues), but we were able to rule out virtually all other signs, and the pattern was very aggressive.
Could you elaborate more on how you determined to 98% certainty the penalty was links and give examples of the links. I'd like to follow the same pattern to sabotage all my competitors' SERPS. :)
How do we stop someone injecting bad links pointing at our site?
How do we rank a particular page for a single keyword phrase?
How do we create good external links?
I'd have to agree with you there, Daniel. Since the webmaster cannot have 100% control over the inbound links, it's unfair and not logical for Google to penalize the site as a whole. Imagine a world where you could buy 1000s of spammy links and knock your competitor from the listings. The most logical thing is to devalue the links so that the incentive to build them is 0.
After doing a large redesign and adding lots of pages and content to my site, I noticed I was no longer performing in the engines. In fact, only my home page displayed for my branded keywords. After starting my SEO campaign I realized that WordPress was displaying the pages correctly but returning 404 errors to the search spiders... so users like me saw a functional page, and Google saw a home page linking to 30-40 404 pages.
I'd definitely check other avenues first that are much easier and concrete to spot before jumping to the conclusion of bad links.
I agree, it would be too easy for competitors to do this. I believe Dr. Pete that some bad links can be causing problems, but overall it seems like technical issues and deceptive on-page tactics that you perform on your own site – cloaking, keyword stuffing, hidden text, etc. – are the only worthy and fair metrics to penalize a site by.
Good post Dr. Pete! But
some of your points are beyond my scope of thinking. For example:
1. Finding bad links – it's difficult to identify them!
2. Removing the bad links – how can we do this? It seems impossible to me! We can't remove our links from other sites.
3. We don't have control over links pointing to our sites. If some po*n site links back to my site, what can I do? Why should Google penalize me for this?
4. Building good links does not remove the bad links, and as you say, bad links take time to devalue. This can't improve my ranking quickly – it takes a lot of time to overcome.
5. What if someone whom I did not hire makes spammy links for my site? I can't do anything about that.
1. It will take some work, but it's not all that difficult. Links can be found in Webmaster Tools or a backlink checker like SEOmoz Pro here, and the work involved is going through the list and singling out the bad links by looking at the sites/pages they come from.
2. Pete suggested a couple of ways – requesting removal from the webmaster, or removing the page. But I agree it is mostly a daunting task.
3. I agree with you. Google should not, but it probably does in some cases when there is a huge pattern.
4. Good links can eventually give your site enough authority that they thoroughly outweigh the bad links.
5. It's basically the same question as no. 3.
1. I mean that finding backlinks is not so difficult, but figuring out which of them are bad is the difficult task, because no certain rules are defined – at least I haven't found any.
2. "Requesting the webmaster" is not a practical way if too many sites are involved; secondly, the webmasters of such sites sometimes won't remove the links because a third party paid them to add our site's link.
3. I may not be exactly right, but whenever Google finds such a pattern, it penalizes. Maybe I am wrong.
4. Yes, they do, but what if someone keeps adding bad links to decrease my site's worth?
This whole scenario is about negative SEO, because I don't think any webmaster adds spam links pointing at his own or his client's site.
Checking the Page Authority of the pages you are getting links from, and the Domain Authority of the linking domains, can be a good way to find bad links.
If you play around with links and do research on them, I don't see it as difficult to find bad links in a link profile – but yes, it will be frustrating like anything!
Requesting removal from a website is usually not a practical approach, but it really depends on the case. I know a company that built all its junk links itself, so it had all the info, and it was easy for them to go ahead and remove them (easy if you have an in-house SEO/link builder).
As far as devaluing a bad link just because it wasn't built by me... it sounds legit, but I don't think Google devalues bad links all the time – sometimes people face a penalty even when they didn't build the junk!
Dr. Pete, I totally agree that there should be some way by which we can disconnect unwanted links that may have been created for various reasons.
In fact I had asked this question to Vanessa Fox at SMX East in September 2011 and she said that as of now it is not possible.
We had such an experience when one of our sites got hacked with the link spam injection malware: thousands of links were built in just a few days and were reflected in GWT, and Google took a manual action of removing the site from the index.
We had explained the whole experience in detail on
https://www.alrayeswebsolutions.com/blog/seo/solution-to-the-link-spam-injection-hacking-attack-and-reconsideration-requests-to-google/
Of course with the reconsideration request the site got included back and all the links were also removed from the GWT.
But now the same site has been hacked again, and again the links are being reflected in GWT, but the site is not banned. So disconnecting these links is a big issue for us, as a reconsideration request will only work if Google takes some manual action like removing the site from the index.
At least for such situations there should be some way by which we can eliminate the adverse effect of unwanted spammy links to the site.
Bharati, Thanks for taking the time to comment and to share the link about your client reconsideration request. I found both very helpful. :-)
To be honest, I don't think there is such a thing as good or bad links. There are links that help you and links that don't help you.
It's possible to trip a bunch of filters by doing some stupid things, but then, if someone can point 10k spammy links at your site and get you sandboxed or trip a filter, it's your own fault because you haven't built enough good links.
Point a billion shitty links at SEOmoz – what's gonna happen? Nothing.
I continuously see sites with mostly spammy links rank high. So it isn't the "bad links" that are hurting them. On the contrary, these bad links frequently help. The problem is when someone isn't knowledgeable and does something stupid.
Unfortunately, I've seen situations where malicious link-building was able to bring down a site. It does take real effort, and I agree that it very rarely works on sites with a solid, established link profile. Over the past couple of years in Private Q&A, though (where I might review 5-10 sites/day), I'm seeing it more often than I previously believed as a consultant.
Absolutely agreed that many bad links do actually help ranking. That was part of my point in the post about why cutting links is so difficult. You never know which bad links are really causing harm. Most of this applies to the really extreme cases, and most of the time it's links that were either built by the site owner or someone they paid. The most common scenario is when they hired someone and didn't realize what that someone was doing, tactically.
Sorry for the late(ish) comment. I just wrote a post on a similar thing (how we identified and rectified malicious links) and someone suggested I check your post out as well.
I have seen a few people suggest a 301 to a new page so your rankings will return – something we couldn't try.
I would say asking for links to be removed, even if there are a lot, is worthwhile. We managed to get rid of a lot. With a little bit of a play with Google Docs we built a sheet to automate getting stuff like the whois contact information and mail-merge it; it didn't take too long.
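Something like this covers the lookup side – a rough sketch, assuming the python-whois package (the domain list is just an example):

    # Rough sketch: pull whois contact emails for a list of linking domains.
    # Assumes the python-whois package (pip install python-whois).
    import whois

    domains = ["example.com", "example.org"]  # stand-ins for your linking domains

    for domain in domains:
        try:
            record = whois.whois(domain)
        except Exception as exc:
            print(f"{domain}: lookup failed ({exc})")
            continue
        print(f"{domain}: {record.emails or 'no contact listed'}")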
Some nice tips though.
Matt
Good stuff, thanks - it's nice to see more public stories like that (as I haven't been able to share some of the examples I've seen this year, due to confidentiality). While bad links don't usually harm you, people have started reading "don't usually" as "never", and that's just not true in my experience.
This is a good post, recovering from bad SEO is a major headache for many of us. It's not just the damage that bad SEOs do to individual clients, but the damage they do to the reputation of the industry as a whole.
I think there are some important distinctions to make here, though.
There's not really any such thing as a bad link as such, just worthless links.
A "bad" link is either:
a) Hopelessly off-topic - links from porn sites aren't inherently bad, just irrelevant (unless you're a porn site too)
b) Identified by Google as "spammy" and therefore worthless
The penalty that people perceive is the result of thousands of links suddenly being ignored: any beneficial effect they were having suddenly disappears, causing a drop in rankings that looks like a penalty.
There's no point in getting "bad" links removed unless they're on sites that are easily linked back to you. At worst they are worthless, and there is a risk that you'll remove some good links too by mistake.
It's often claimed by SEOs that they've seen genuine "bad link penalties" first hand, but press them for evidence of it, and you won't get any. Google can't let this happen because you'd be able to sink competitors by buying a load of spammy links.
Unfortunately, I have seen situations where I'm 98% sure that malicious link-building caused a penalty. I won't say I'm ever 100% sure, but that's true of almost any SEO situation. In these cases, the patterns were aggressive and fairly intelligent. A few links from porn sites won't do it - I absolutely agree that Google has to set the bar very high on this.
The evidence aspect is tough, not because most SEOs aren't telling you the truth, but because when we know a case that well, it's almost always a client, and we're bound by confidentiality. I know of a clear case here in Private Q&A, but, of course, I can't reveal the details. That's one of the more difficult aspects of advanced SEO.
I will agree that, many times, when someone is sure they've been penalized by a competitor's activity, something else is really going on. In some cases, they were already on the fence, and it's possible the competitor pushed them over the edge. In those cases, though, they had already given Google a lot of bad signals.
I wonder if Google has tools that tell them whether link building was probably/definitely done by someone associated with the website, and these would be the links to get a penalty. Otherwise, how could they possibly justify penalising malicious links when anyone can build them.
I found out that my free vector resource was uploaded on a porn site and is pointing to my website. Those fools created a collection of free vector freebies on a porn site. Do I have to disallow this link in Webmaster Tools?
Is there any evidence that Google penalizes these bad links instead of ignoring them?
Excellent post Dr Pete.
The "be sure it's links" section is golden, a lot of webmasters are very quick to point the blame at external factors than accept their sites shortfalls (weak content, poor design, clumsy UX).
I think point 4 is the only real productive solution. Chasing your tail trying to figure out why Google doesn't like your link portfolio is never going to drive a website forward. Instead, if Webmasters think about the long game and continue pushing for highly authoritive and relevant natural links, then they are only ever going to succeed. However, I appreciate the argument that some websites simply can't afford to wait for these things to iron themselves out.
I love the proposal too. As an added benefit for Google/Bing, this feature would provide a really strong spam signal direct from webmasters (i.e if a large volume of sites are including 'disconnect: www.badsite.com' then it would be a pretty clear signal that badsite.com is dangerous territory)
As you said, we should move to a new home if nothing else works. I have a doubt, though: as we know, Google passes PageRank, rankings, and link juice to the new domain. Won't Google pass the bad-link issue to the new domain too? I think Google will pass all the bad links to the new domain and it will become worthless.
I think disconnect: www.xyz.com could be a better idea, but as you also said, Google will never support us... :(
Just wanted to say the first paragraph with "your friend" (in quotes) made me smile. Don't forget about "I have a client" too!
I will have to disagree with the Robots protocol because someone could hack your robots.txt and disavow your authority links. I would support something through WMT instead.
Although that's true, someone could also hack your Robots.txt and deindex your entire site with the current options. I'm not sure that's really an argument against it. If someone can hack you, they can do a ton of damage, with or without this change.
If they disallow your site, you will notice that right away but if they disavow links you might not notice for weeks.
While I don't think hacking should be a deciding factor, that's a valid point. It could go unnoticed.
For all those struggling with 'how to remove bad links':
On Twitter a few weeks ago @simonpenson suggested contacting hosting ISPs directly and telling them that they are hosting a site that is breaching your copyright (by linking to your site). I would suggest always trying to contact webmasters first, but if that fails, then try the legal stance.
I like your Robots proposal but agree with Ted that it could lead to link sculpting. It could even incentivise black hat techniques if site owners know that they can just disallow bad links later on if Google 'catches' them.
The proposal is a great one but I suspect Google would never do it because it could enable a sort of external link sculpting.
For instance, Google used to have verbiage on their website a few years back saying that Quality Score in AdWords is partially determined by what Google thinks your site is about (i.e. how relevant your site is to the keywords you're bidding on). They since changed the reference to say "and other relevance factors".
I've seen some evidence with a client recently that this is still the case, and it looks to me like Google bases what the site is about on the anchor text profile, and *not* the on-site content, for mature sites.
If this standard were implemented, people could carve out all the links that say "click here", or whatever anchor text is slightly less relevant, and really sculpt what Google thinks the site is about – and really drive their AdWords Quality Scores for their top keywords through the roof.
So I think they would never do it, unfortunately. Neat idea though!
The problem with a disconnect directive is that it would give impunity to the spammy link builders - you'd be free to build as many spammy links as you liked, getting the benefit from those that slipped through the net and safe in the knowledge that you have a get out of jail free card if you get penalised.
I think the engines just have to get better at determining what is and isn't a spammy link.
There certainly would be some potential for manipulation. Practically, I don't think you'd see a lot - building a link to see if you got penalized, waiting for a penalty, and then cutting it isn't a very efficient black-hat tactic (and still carries risk). In many cases, people who use these tactics can already control the links. If I buy a link, then realize I got a penalty, I can have it removed. A couple of major brands have gotten in trouble for this - they built a ton of low-quality links right before a critical period (like a holiday), got the short-term ranking boost, then cut them right after the holiday to avoid a penalty.
By definition, this directive would apply mostly to the people who don't have control over the bad links pointing to them. The real black-hats can already add and remove links at their leisure.
I see your point - but even with paid links it would make it a lot easier to manage the process of cutting the links, negating the need to contact each site owner.
I think that Google would argue that the goals they're working towards for identifying unethical practice would make such a directive redundant. Whether they'll ever get there is another story.
Thanks for the reply and for a great post.
One of the challenges people will face is determining which links are bad and which ones are good if you have more than a thousand choices.
I know this is a bit of an old post, and sorry for getting to it so late. Anyway, I do have a question related to: "I’d recommend the 301-redirect if the bad links are old and spammy. In other words, if you engaged in low-value tactics in the past but have moved on, a 301 to a new domain may very well lift the penalty. If you’ve got a ton of paid links or you’ve obviously built an active link farm (that’s still in play), you may find the penalty comes back and all your efforts were pointless."
Have you seen this live? Since a 301 passes the link juice (99% :) ), I was wondering whether a penalty really will be lifted for a new domain if the old domain has a spammy inbound link structure.
I think that recovering from bad links is almost impossible. I liked the strategy of building good links, because your good links might eventually outweigh the bad ones.
I have seen some shocking link building going on by specific companies recently; in some cases I have even seen companies doing link building on adult websites.
The way I have usually taken down these links is to contact the people who run these sites and explain our situation, 90% of the time the links are removed.
But great examples, Dr. Pete – all can be utilized. In my opinion, going direct and contacting the sites where bad links have been built can work well too, though it depends on the scale of the links built.
I don’t think that is a practical way. We can’t contact them all manually – it takes a lot of effort and resources that could be used for other purposes. Many po*n and adult sites don’t display their contact details, so finding their contact information is another problem.
I really like the robots.txt suggestion - perhaps Google & the other engines can collectively support a format that isn't too difficult to implement & update on a regular basis.
Great post!
What the... I mean EPIC, sir! Sometimes you dance around when what you've been thinking actually makes it onto the page, especially when it’s coming from a known professional like you!
Can’t agree more with ‘Disconnect: *.badlinks.com’
As far as ways to remove the bad links... in my personal experience, the easiest and quickest way almost all of the time is to build some good links quickly. I mean, you can always offer a discount (if ecommerce) or run some other form of campaign (depending on the nature of your business) and market it properly to get more links.
Removing bad links is possible but it is really frustrating and even then you cannot clean your link profile just like that!
Thanks for sharing this SEO method.
As noted by others, bad links are not penalized. Google has officially stated that bad links are not penalized as such. If that were not the case, this would be abused as a weapon against competing websites. Rand's Whiteboard Friday last week touched upon this as well.
Even so, your tips on how to cope with bad links are insightful.
Option 6 doesn't work... sure, for a month or so it's OK, but after that the penalty transfers.
My advice is to take down all your links incrementally instead of waiting it out.
Start with any that are on home pages!
I think I got a new idea, and I will try it.
Really cool post. Thanks
Hi there,
I'm looking a way to invite Google to reconsider my website https://www.youtubeviewsbuy.com/
It has been hacked, and a lot of hidden links to unknown sites were injected into every page.
So my website lost several positions in the Google SERPs.
How to communicate with Google since I haven't got a manual action?
(I don't see any link to their reconsideration request page)
Thanks
Andrew
Hi Andrew!
This is actually a really good question for our Q&A forum. There are a number of experts in there who I'm sure would be more than happy to help! Honestly, you're not likely to get a response to a comment on a years-old blog post. :)
Where this question is concerned, though—if there's no manual action, then there's really no reason to file a reconsideration request. It's most likely that you lost position due to natural fluctuation of the SERPs, which is automated.
Bad links, what a horrible problem to have. Because you really have no way of fixing it.
It's like an arms race. If a competitor decides to go Google bowling on your site, the remedy is to build more good links in order to have a good proportion compared to the bad links.
Not a great solution. There will always be people who try to game the system for a quick buck, whatever the tactic – robots.txt, nofollow, noindex, whichever.
I fully agree with your opinion, Peter! A lot of tools are now available to better diagnose the exact problem with a website, and tools are also available to get the details of backlinks and help remove them. Recently Matt Cutts introduced the disavow tool to help stabilize a website in Google search again.
Good post Pete. You describe some good ways to fix a site whose rankings got hurt by links (intentional or not), but I still think "bad links" is way too blurry a concept.
One of my clients got 1,700 links in one day – it must have been a competitor who did this. Afterwards he lost some rankings.
Interesting article, but I wonder whether bad links can have that much influence unless there is a really massive influx of them. Even so, I suspect that the best Google could do in such a case is simply ignore them, or else risk the black-SEO guys starting to bring down all the competition...
But the idea of a specific entry in the robots.txt file is interesting, despite some of the drawbacks that others have pointed out. After all, don't we already have the possibility to create "nofollow" links in our pages? This would be a "reverse nofollow" or "noentry"...
In any case, there is one thing that I have not seen in your proposals: why not use a "mod rewrite" when getting a referrer hit from such sites and redirect it so as to generate a 404? The bad link would never materialize on your site...
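Something along these lines, I mean – a rough .htaccess sketch, with badsite.com standing in for the offending referrer (assumes Apache's mod_rewrite):

    RewriteEngine On
    RewriteCond %{HTTP_REFERER} badsite\.com [NC]
    RewriteRule ^ - [R=404,L]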
How do you find out whether a backlink is bad?
Unfortunately, you can't block links by referrer, because spiders don't crawl links in quite the typical way. Google stores the link information as part of the link-graph but doesn't necessarily travel those links from site to site like a human would. They also don't pass referring data.
Let me find a 3rd party overseas service to create poison links that I can point to my competitors, they'll be buried in link analysis and waiting on replies that never come from webmasters of spammy directories/websites whilst I get another 1000 sub-standard links pointed to their website.
Or
Give us the means to "Disconnect", "Disassociated" or "Discount" bad links on a domain basis so Google know we're washing our hands of them. This still involves work on our side of the fence but at least empowers us with the ability to manage it. Conversely it would also inform Google who we are happy to have linking to us.
When you're defining the rules of the game it makes sense to make sure people that turn to the dark side are the ones that have to do the most work not the other way around and this is what I resent most.
Is there any solid evidence (not anecdote) that Google actually penalizes you for having "bad links" to your website?
One of our competitors has around 2,800,000 inbound links, most of them (> 70%) "spammy" (indeed, most of the websites that link to them are link farms), and they rank #1 for many keywords.
just report them
Hi, thanks for this article. One of my websites, a garden office company Extrarooms, recently dropped in ranking in the UK, from 1, to 2, 3 and now 7, over the last few months. We inherited it about 4 years ago, and it had a big number of bought links.
We created a link section, for reciprocal linking, and now try to focus on solely garden or building related links. One site approached me, offering garden tools.
I agreed to a link exchange, but it seems that their site is actually a suite of sites, all with similar content. These sites amount to about 22,000 links into our homepage. I'm assuming that's bad news.
Typical site for them is: https://www.harvestknife.com... with loads of links to the same content in different URLs, with masses of pages and our link on each.
I know our site needs better content, and we are fixing that, but I am worried that these links might end up damaging us.
Any thoughts would be appreciated.
Thanks
Scott
If that's a large enough part of your link profile then yes, I'd say it could put you at very real risk. Google has been pretty tough on link networks this year. Whether to cut them or not is a tough choice, and I think you'd need someone to take a deep dive and really see the big picture, but I wouldn't push those kind of links too hard.
Thanks for your thoughts Dr. Pete.
Regards
Scott
But what happens when I am asked by an SEO company to add a post which links to a major brand? The SEO company benefits and the brand benefits, but how does my website benefit? Can you help with this question?
Thanks for the article, definitely a help! I like the idea of the 'Disconnect' option for the search engines, but I'd rather see a functionality like this be within Webmaster Central, than on the open book robots.txt file.
You are absolutely correct, Mr. Pete. A site owner cannot control the inbound links. Google should provide an option in GWT, or an application, to handle such horrific situations.
I tried to block my site's backlinks from some adult sites using Disconnect: in robots.txt. Unfortunately this syntax is not working (a syntax error shows in Webmaster Tools). So, is this syntax correct?
It's only a suggestion... it has not been accepted yet, so don't try to implement it.
@Dr.Pete
I came to know about this blog post via your recent post on Penguins, Pandas, and Panic at the Zoo! I have very good experience with option number 6. Honestly, it's working without any issue in terms of performance and ranking.
But I have a very critical issue after implementation: I bought a new home, but I have to hang a signboard with my old address. :)
I want to give a live example from Google search results. I removed my old website and set a 301 to a new category page due to a duplication issue. Honestly, who wants to mess up performance in a Panda update?
There are 56,311 links & 551 source domains pointing to my old home page. Just see this one!
I'm creating quality links to my new category page, where the 301 redirect from the old home page points.
My big question is about the URL displayed in search results. Why does Google show the old home page URL? Do I need to create 56,311 + 1 links or 551 + 1 source domains to get my new address to appear?
Dr. Pete, I always follow your instructions on my website and always get positive results. This time I'm waiting for your prompt reply.
I don't currently see either domain popping up in that SERP. Keep in mind that it can take time for Google to process/honor 301s and if you did a mass 301, it sometimes gets messier. If you'd like to submit this as a Q&A, we'd be happy to take a deeper look.
I thought twice before posting my question in this comment, and I had decided to raise it in the Q&A section. But after reading the entire blog post, I realized this is the right platform to add my comment. Here we go... https://www.seomoz.org/q/how-to-recover-from-bad-links
Hi guys, I have submitted a disavow to Google for the whole domain – or do I have to specify each and every URL? There are over 32,000 links from this one domain, and it just hammered my rankings. If I block the whole website in cPanel, will this help? Google doesn't seem to be responding to my disavow request. Thank you for any help at all.
Finally, Google has provided a Disavow Links option in GWT... congratulations!
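(For reference, the disavow file is a plain text upload, one entry per line – a domain-wide line covers cases like the 32,000-link domain above; badsite.com is a placeholder:)

    # spammy domain - disavow every link from it
    domain:badsite.com
    # or disavow a single URL
    http://badsite.com/spammy-page.html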
Hi there,
I need to know the right way to do this "disconnect". Can you give an example or explain more about this, with some screenshots, etc.?
If it works, then it's an awesome thing.
Thanks for the information.
Regards
Google should have had an option in Webmaster Tools that allowed owners to disown links so as to avoid negative SEO / bad links. That would have saved a lot of headache for victims of 'bad links'
Seems to be that "Disconnect" would be pretty exploitable. You could build upp a bunch of trashy links, ride the short-term benefit, then cut them off before you face Google's wrath.
IMO your time is best spent building quailty links back to your site, over time the bad links will be forgotten and the new quality links will offer benefit.
The robots idea sounds like the only 'safe' way forward!
Some nice details and advice, but I don't think many people get penalties from bad links, because it would be far too easy to do it to your competitors! I think in general Google would simply devalue the bad links unless they are sure you have built them – for example JC Penney and its paid links.
When people talk about "bad links", I think a lot of people presume you simply mean low-quality links, but those do little to no damage to your link profile (they just don't help, nor do they hurt you). When I think of bad links, I think of links which Google can see you have built to manipulate rankings (paid, link farms, and so on), and they don't always have to be low-quality links to be bad links. Again, just look at some of the links JC Penney got which ended up being bad for them – not all those paid links were of low quality!
Anyway, that's my little rant over :)
I love the idea of the “Disconnect” in the robots file, but don't you think it would get abused by spammers? For example, you could build, say, 1,000 spam links which work for a week, and when you notice Google starting to take them down you could simply “Disconnect” them all.
Not saying it sounds like a bad idea – in fact you have my vote for it – but I think Google would have problems with spammers abusing it, and that's probably why we don't have such an option :(
See some of my comments above, but it's true there's some potential for abuse. I think, though, that it's minimal, for one reason - black-hats already do exactly what you describe. Some sites build up 1000s of spammy links to rank for a short burst, then remove them before a penalty kicks in. As long as you control the links, that's easy enough. The Disconnect directive would really be tailored for people who can't control those links - by definition, they're more likely to be victims than perpetrators.
Great post, these suggestions all make a lot of sense. I had never even thought of being able to take care of bad links from the other end (the disconnect idea) I guess it was just outside of my box of thinking. It would definitely be nice! I hope Google does it someday! Maybe Bing should do it and then they can return better results and tip the search scale in their favor.
Thanks
Thanks, Dr. Pete, for making me hold my breath. I found this very worthwhile for my client's website! The instructions you gave were very helpful for getting rid of bad links to some extent. Honestly,
I have a request – please, please give me a reply to it.
My client's website is being targeted by some negative SEO people from the competitor side (I suspect), because it has gotten bad links from spammy sites, plus fake bad reviews and complaints on complaint-board forum sites.
What should I do in order to make the website's reputation completely neat and clean?
Honestly, I have already applied strategies from some of SEOmoz's best posts (e.g. https://www.seomoz.org/blog/our-online-reputation-management-playbook), but I still haven't gained complete control over the negative SERPs and bad links – the negative reviews still show up!
So I want to get the links pulled out of the bad blogger's blog and the fake complaint forum threads! Please give me advice!
Thanks again!!
Let met clarify that, in many cases, these links will be devalued by Google and won't harm you. It takes a lot of malicious links to cause a problem. If you don't suspect a penalty and if the competitor is just trying to cause minor trouble, your best bet may be to leave it alone. If a lot of links are coming from a few sites, I'd contact the sites. Often, they don't like spammy links any more than you do, and they might be able to ban a couple of users and solve the problem.
Thanks Dr. Pete for your response!!!
Removing possible bad links is a lot of effort to try and solve something that Google probably doesn't do - ie. penalise your site for certain links instead of just discounting them. I just concentrate on what they definitely do, which is count good links.
Really good info about the problem some of my clients faced when they approached me for the first time. I was actually stunned to see how badly they had been linked and tricked. They were complaining that they got a big boost in their rankings, which then started falling, and they were very desperate. It's a nightmare if you have hundreds or thousands of backlinks from spammy sites. Approaching Google is an effective way to get rid of such links. I'm still struggling to save my clients and get their rankings higher. I must thank you for the post. Thank you very much for the tips.
Great post.
If you cut the page and 301 it to a new page, does that not pass on the value of the bad links to the new page? I noted you said "if it's not too severe a penalty" – how would you work this out, or do you think it is just a gut-feeling thing?
The idea of a robots file entry that would let a search engine know to exclude certain links from the link graph is a great one – although I do wonder if that would exclude people who don't have the technical know-how. Maybe it would be best to add this kind of facility to Webmaster Tools, etc.?
I'm honestly not 100% sure this works on the page level - in most cases, I think you'd have to 404 the page. I'd suggest it in cases where the penalty is clearly targeted to just that one page and isn't gratuitous (you haven't bought dozens of paid links, for example). We've certainly seen cases where 301'ing to a new domain killed off a link-based penalty, but even that's not guaranteed. Google's behavior is a bit inconsistent, and 301s have clearly been abused.
Hey Dr.P
I like your idea; however, I doubt this is ever likely to happen, due to it giving too much away – we would be able to easily test exactly what a link was giving us by blocking all links and testing results for a single link.
I know we can test things out already, but this would make it far too easy for us, and the results would be more realistic.
Great post though and something to think about.
See my comment just above, but I'm definitely not under the illusion that this is likely to happen. Mostly, I thought it would make for an interesting debate. I'm honestly hoping to see some approaches that I haven't thought of yet in the comments.
Good post, Dr. Pete. The most interesting part of the whole article for me is the "Cut the Page" section. I haven't heard anyone suggest cutting the page, if it's a deeper one, to escape the spammers trying to dump on us. But as you mentioned, most spammers will point their link spam at the home page; in such instances we have to think and act differently, or someone has to come up with an efficient plan to keep the spammers away from our area.
Thanks for great post !!
Great post indeed.
I think that your idea about blocking incoming links at the robots.txt level is awesome and something that search engines should really take into consideration.
Naturally it would require some tweaking, considering the objections raised by Jamie_Griff, but I still think it is a good idea.
Great post Pete, don't know how you do it, so many great posts from you recently.
What do you personally specify as a 'bad link' though? A link from a site with low Page or Domain Authority, as some are suggesting above? I think a lot of the time you'll end up with a ton of links to a site from websites with low PA/DA and this can't be avoided, it's only natural that the bulk of your links will come from 'low authority' sources.
Instead I imagine you're talking about links from clearly unnatural sources, link farms on the same IP, adult sites, that sort of stuff - actual 'bad' links rather than 'low quality' links - important to know the difference.
I like your suggestion for robots.txt to eliminate bad links, something like this would be ideal if you end up working on a site which has truly been hammered by poor SEO tactics previously. Would think it's unlikely to ever be seen, but here's hoping.
Yeah, generally, by "bad" I mean clearly manipulative - paid links, link farms, link circles, injected links, etc. Low-value tactics, like dofollow comments on irrelevant blogs with 100s of comments are almost always just devalued. I have seen cases where new sites ran into deeper problems from low-value link-building, but that's usually because it was 90%+ of their link profile.
Thanks Dr. Pete for the great post.
I am just curious guys.
Question:
Let's forget the technical criteria for bad and good links.
How about this, a great backlink with relevant anchor text within a bad review, a bad comment, a bad blog post, from a high authority, highly relevant website.
Will you consider this a bad link and, therefore, block it? Or...
Your thoughts?
Thanks.
I'm having a hard time picturing that link in real life ;) As the Decor My Eyes scandal revealed, to some extent, a good link in a bad review is probably still a good link. A link from a high authority, high relevance site (unless it's clearly a paid link) is probably a good link in Google's eyes. I don't think you'd want to remove links like these, in most cases.
Thanks Dr Pete for the feedback.
Therefore, a bad review won't always hurt you (it may even be good for you), and BAD is not always harmful.
And what's a bad review in one person's eyes might be positive in the eyes of another person. One person might write that a dentist is overly chatty and a bit slow, yet another person could say that same dentist makes them feel at ease and not rushed.
Amazon has a "most helpful one star review" highlight area on many of their products. I have actually purchased products based on the negative review, as what was negative for that person wasn't a bad thing for me (the loudness of a paticular product).
I know this is getting a little off topic of Dr. Pete's post here, but I believe not all negative reviews are 100% negative in all situations.
Thanks Keri. :)
And in my eyes, your feedback is stellar. ;)
Nice post but...
Inbound links rarely cause penalties. Spammy links would get devalued or just ignored. However, building too many links to a page can get the page sandboxed for 1-3 months. If inbound links could invoke penalties then link building would be about getting your competitors penalised. Therefore, disconnect in robots.txt isn't necessary at all.
One big omission in the checklist for making sure your site has been penalized is to check your outbound links. Linking out to a spammy site can indeed invoke a penalty. There are several tools that can collect all outbound links, but the process of figuring out which one may have harmed your site unfortunately is manual. Just make sure your site doesn't link out to sites which are considered spam, such as anything about poker, porn, Viagra, etc.
Good point on the outbound linking. As I've commented above, I agree that, in most cases, bad links aren't the cause of people's problems. Unfortunately, I've seen those rare cases more frequently the last couple of years. It does usually take a large-scale effort.
Curious to hear people's thoughts on social bookmarking sites, member profiles & directories - good, bad or just plain ugly?
These are pretty common in backlink profiles - are they just ignored by Bing / Google or does PA/DA come into play?
How can one differentiate bad links from good links? Do you have any parameters to gauge bad or good links?
Try some site operator tags in Google – info:badsite.com, etc. If Google doesn't hold a cache of your website, then maybe that is a place to start!
The advice given in this article is flawed on many fronts. I posted a comment below; read it for more details.
Devaluation of links has been happening for many years. The new notification that Google is sending doesn't change anything. Submitting for reconsideration is the absolute worst thing to do – it will only make things worse in 99% of cases.
You are absolutely correct. Reconsideration should only be done if the site has been completely de-indexed, otherwise you are just volunteering information.
Admittedly, "bad" covers a lot of ground, and we use a lot of language to mean similar things. When I think bad, I think paid links, link farms, link circles, obvious (large-scale) link exchanges, and other deliberately manipulative tactics. Then, there are the "low value" links, which are really just spammy - comments on irrelevant blogs, articles you spin out 50 times, etc.
There's no magic formula – Google sees bad links through the link-graph as a whole, which is why they have to rely on patterns, in most cases. We typically have to spot bad links the old-fashioned way (one by one). Personally, though, I find patterns aren't that hard to find. It's amazing how often I can spot a bad pattern in Q&A in 10-15 minutes by just browsing links in Open Site Explorer.
Khalid,
One tool I use for finding bad links is Buzzstream. They provide two important indicators for "spammy" links and that is the number of external links on the page (spammy links like auto approves usually have hundreds or thousands of links per page) as well as whether words like porn, poker, viagra, etc ... appear on the page.
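A rough sketch of those two checks – assuming the requests and beautifulsoup4 packages, with a hypothetical page URL (not Buzzstream's actual implementation):

    # Flag a linking page as spammy by external-link count and spam words.
    import requests
    from bs4 import BeautifulSoup

    SPAM_WORDS = {"porn", "poker", "viagra"}
    LINK_THRESHOLD = 100  # auto-approve pages often carry hundreds of links

    def check_linking_page(url):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        hrefs = [a["href"] for a in soup.find_all("a", href=True)]
        external = [h for h in hrefs if h.startswith("http")]
        text = soup.get_text().lower()
        hits = sorted(w for w in SPAM_WORDS if w in text)
        return {
            "url": url,
            "external_links": len(external),
            "spam_words": hits,
            "looks_spammy": len(external) > LINK_THRESHOLD or bool(hits),
        }

    print(check_linking_page("http://www.example.com/links.html"))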
Bigwillg, thank you very much for your kind suggestion.
How did you use Buzzstream to find bad links? I know that you can type in blacklist words, but is there any way to pull up the sites that have a large number of external links? We're trying to identify spammy links the previous owners of our site built.
I work in the affiliate field and so I see this often. If anyone wants to know how to check for panda penalty, pagerank penalty, how to lift penalties, how to surgically remove links pointing to you, how to analyze and find bad links of all the various different kinds, and most other onpage or offpage issues, just contact me. I usually do the analysis for free (that's how I know all of this).
Hello there
I would actually like to take you up on the offer you had made on the SEOMOZ blog in regards to providing some assistance with google penalties, the panda penalty, etc...
After the latest algo change I have been hit fairly hard, and am willing to pay for some assistance weeding out the bad links and so forth...
My email address is [email protected]
Hope to hear from you soon...
Kind regards,
Adam
I would also like to take you up on your offer. I am having a hard time determining what has affected my sites. I have some ideas but not sure which is affecting the most.
What is the best way to contact you outside of the forum?
My site www.westchestercomfort.com was ranking at the top, but for a long time now it's been way down – not a single keyword ranks. Can anyone help?
When I build external links, can I use the same description for the same page with a related keyword?
For example: the keyword "industrial switchgear".
Can I use the same description for the industrial switchgear page to build many links?
You can check it from here: https://www.bad-neighborhood.com/text-link-tool.htm
that tool checks the links on your site, not the links pointing to your site...
I hope Google listens; the "Disconnect" would be greatly appreciated. It's too easy for competitors to attack a site with bad-neighborhood links. Webmasters need an option for protection.
I am in love with your robots.txt proposal, though it seems like it could be used to manipulate the link graph. I don't know, every example I could think of also had an easy answer, so... it just seems like it could be the next generation of PageRank sculpting.
Anyway, great post, thanks for the ideas. :)
Good read, Dr. I especially champion option 4, being a content writer and PR advocate, though a client will have to understand building new, better links takes time. The cutting the page option is unfortunate as far as link building but perhaps the content could be saved or updated, saving some time in the future regarding new copy generation. This post is good to have as a reference piece to show to clients with little experience and a lot of anxiety over former, ill-directed campaigns. thanks
To this day I have NEVER seen adverse effects from 'bad links' unless they were purchased from a linking network (in which case the remediation is easy - just tell your client to cancel their account and beg reinclusion forgiveness!). This is my experience with 100's of clients over the years, but I would love to hear an account of someone being penalized for 'bad links' that weren't paid* for...
*due to the Overstock penalty, we must assume that offering discounts & such for links is considered 'paying' for a link
I have seen a couple of examples of malicious link-building harming someone, but as I mentioned above, I can't provide those details due to confidentiality considerations. That's a barrier we often run into in public forums (although I've certainly heard confirmation in private conversations with other SEOs). Granted, it is rare.
The more common situation is something like link networks, farms, paid links, etc., where a 3rd-party created the problem and you no longer have the contacts or login credentials to solve it. That's the main scenario I'm talking about (which may not have been clear) - the remediation is only easy if you know where and how those links were created. If you hire a shady firm and then realize 6 months later what they did, it gets a lot tougher.
WOW! Thanks for the great advice. I've got a site packed with bad links to cope with; it performs quite well in rankings, but it was injected with spammy links by the previous person who was working on it.
No moral judgement intended, but it's frustrating to work on others' malicious activities. It's not easy to recover, and it's not easy to explain to those who pay you that it might NOT be your fault if something terrible happens to their site now or later.
Thanks to you, dr. Pete, that task could be much easier to perform.
The proposal sounds good – let's hope it gets accepted, as it's a good idea and can solve many issues for website owners who acquire links that are not needed and don't benefit them in any way, shape, or form.
Dr. Pete!! Great post...
BTW, I like the last option with robots.txt, because we mostly don't have access to those types of bad sites, but we do have access to our own robots.txt.
Very good idea to use robots.txt! Thanx
Great post Dr. Pete! Loving the graphics you guys have been doing :D
I just want to ask one thing. How far do you go to make sure a bad link is taken down? Do you ever threaten? (i.e. I have the rights to this content or this image, so take it down or I'll bring my lawyer into it)
In most cases, any given bad link isn't going to be worth it. It takes a clear pattern of bad links to cause problems, and you're better off dealing with the low-hanging fruit. The exception would be if the link is causing you issues beyond SEO (like reputation management).
+1 to Khalid Irfan's Question
How can one differentiate bad links from good links? Do you have any parameters to gauge bad or good links?
Sometimes I seriously doubt the parameters for what makes a bad link.
Awesome! I think every SEO professional should read this post. I personally like it too.
I think this will be the best way to recover our website from bad links (adult links) and recover from a Google penalty. But honestly, I think Google counts every incoming link as a good link for a website.
Thanks for the nice information; this will surely help all of us.
Thanks for helping us improve our online presence by sharing these points with us.
Awesome post.
I love the idea of the robots.txt. I only had this problem once, and I used the last tip.
Hope to read more on great items like this.
I also like what GrowTraffic says about webmaster tools ability to block links.
Hey, loved the simple proposal! Seems so simple! Good job!
Wow! Nice post – I have never thought of this before. With all these great posts, I feel comfortable using SEOmoz.
@Pete, it's a great post! Still, I've never tried this "robots" approach. By the way, how can we indicate whether a link is bad or good – by dofollow and nofollow? If yes, then I'm trying for dofollow only.
How can I differentiate between good links and bad links? Suppose I have 1000s of links and now have to evaluate the bad ones – how could I find those links?
There can be so many ways to do so, for instance:
No one says it doesn't consume much time, but in my experience you can always cut the list down by reviewing it in Excel.
Analyzing the PA of the pages your links come from... Agreed with Ssan! It's not impossible, but it will be frustrating like anything!
Yes, I know all these processes, but for 1000s of links it is impossible to check each and every link manually!!!