What if you owned a paid directory site and every day you received email upon email stating that someone wants links removed? As they stacked up in your inbox, whether they were pleasant or sternly demanding you cease and desist, would you just want to give up? What would you do to stop the barrage if the requests felt overwhelming? How could you make it all go away, or at least most of it?
First, a bit of background
We had a new, important client come aboard on April 1, 2013, with a lot of work needed going forward. They had been losing rankings for some time and wanted help. With new clients, we want as much baseline data as possible so that we can measure progress going forward, so we do a lot of monitoring. On April 17th, one of our team members noticed something quite interesting: using Ahrefs for link tracking, we saw a big spike in the number of external links coming to our new client's site.
When the client came on board two weeks prior, the site had about 5,500 links coming in, and many of those were of poor quality. Likely half or more were comment links from sites with no relevance to the client, using the domain as the anchor text. Now, overnight, they were at 6,100 links, and the next day even more. Each day the links kept increasing. We saw they were coming from a paid directory called Netwerker.com. Within a month to six weeks, they were at over 30,000 new links from that site.
We sent a couple of emails asking that they please stop the linking, and we watched Google Webmaster Tools (GWT) every day like hawks, waiting for the first link from Netwerker to show. The emails got no response, but in late May we saw the first links from there show up in GWT and we submitted a domain disavow immediately.
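For readers who have not built one, the disavow file itself is just a plain text file uploaded through GWT. Here is a minimal sketch of the kind of domain-level entry we are describing; the comment lines are illustrative, but the syntax is Google's documented format:

```
# Disavow file uploaded via Google Webmaster Tools
# Lines beginning with "#" are comments and are ignored.
# A "domain:" line disavows every link from that domain,
# which beats listing 30,000 individual URLs.
domain:netwerker.com
```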
We launched their new site in late June and watched as they climbed in the rankings; that is a great feeling. Because the site was rising in the rankings rather well, we assumed the disavow tool had worked on Netwerker. Unfortunately, there was a cloud on the horizon concerning all of the link building that had been done for the client prior to our engagement. October arrived with a Penguin attack (Penguin 2.1, Oct. 4, 2013) and they fell considerably in the SERPs. I mean, they disappeared for many of the best terms for which they had again begun to rank, falling to page five or deeper for key terms. (NOTE: This was all algorithmic; they had no manual penalty.)
While telling the client that their new drop was related to the October Penguin update (and the large ratio of really bad links), we also looked for anything else that might be causing or affecting the results. We are constantly monitoring and changing things with our clients. As a result, there are times we do not make a good change and we have to move things back. (We always tell the client if we have caused a negative impact on their rankings, etc. This is one of the most important things we ever do in building trust over time, and we have never lost a client because we made a mistake.) We went through everything thoroughly and eliminated any other potential causative factors. At every turn there was a Penguin staring back at us!
When we had launched the new site in late June 2013, we had seen them rise back to page one for key terms in a competitive vertical. Now, they were missing for the majority of their most important terms. In mid-March of 2014, nearly a year after engagement, they agreed to a severe link cleanup and we began immediately. There were roughly 45,000 – 50,000 links to clean up, but with 30,000 from the one domain already appropriately disavowed, it was a bit less daunting. I have to say here that I believe their reluctance to do the link cleanup was due to really bad SEO in the past. Over time they had had several SEO people/firms, and at every turn they were given poor advice. I believe they were misled into believing that high rankings were easy to get and that there were "tricks" that would fool Google so you could pull it off. So, it really isn't a client's fault when they believe things are easy in the world of SEO.
Finally, it begins to be fun
About two weeks in, we saw them start to pop up randomly in the rankings. We were regularly getting responses back from linking sites. Some responses were positive and some were requests for money to remove the links; the majority gave us the famous "no reply." But we were making progress and beginning to see results. Around the first or second week of April, their most precious term, geo location + product/service, ranked number one and their rich snippets were beautiful. It came and went over the next week or two, staying longer each time.
To track links we use MajesticSEO, Ahrefs, Open Site Explorer, and Google Webmaster Tools. As the project progressed, our Director of Content and Media, who was overseeing the project, could not understand why so many links were falling off so quickly. Frankly, we were not getting that many sites agreeing to remove them.
Here is a screenshot of the lost links from Ahrefs.
Here are the lost links in MajesticSEO.
We were seeing links fall off as if the wording we had used in our emails to the sites was magical. This caused a bit of skepticism on our team's part, so they began to dig deeper. It took little time to realize the majority of the links that were falling off were from Netwerker! (Remember, a disavow does not keep links from showing in the link research tools.) Were they suddenly good guys, willing to clear it all up? Had our changed wording caused a change of heart? No. The links from Netwerker still showed in GWT; Webmaster Tools had never shown all of the Netwerker links, only about 13,000, and it was still showing 13,000. But was that just because Google was slower to show the change? To check, we did a couple of things. First, we visited the links that were "lost" and saw they still resolved to the site, so we dug some more.
Using a bit of magic in the form of a User-Agent Switcher extension and eSolutions' What's my info? (to verify the correct user-agent was being presented), our head of development ran the user-agent strings for Ahrefs and MajesticSEO. What he found was that Netwerker was now blocking MajesticSEO and Ahrefs via a 406 response. We were unable to check Removeem, but the site was not yet blocking OSE. Here are some screenshots to show the results we saw. Notice in the first screenshot, all is well with Googlebot.
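If you want to replicate the check outside the browser, a few lines of Python will do it. This is a minimal sketch, not our exact process: the directory URL is a placeholder, and the bot user-agent strings are the publicly documented ones (exact versions drift over time, so confirm against each crawler's own page):

```python
import requests

# Documented crawler user-agent strings, plus a normal browser for contrast.
USER_AGENTS = {
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "AhrefsBot": "Mozilla/5.0 (compatible; AhrefsBot/5.0; +http://ahrefs.com/robot/)",
    "MJ12bot":   "Mozilla/5.0 (compatible; MJ12bot/v1.4.5; http://www.majestic12.co.uk/bot.php?+)",
    "Browser":   "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.125 Safari/537.36",
}

url = "http://www.example-directory.com/listing-page"  # placeholder URL

for name, ua in USER_AGENTS.items():
    # A 200 for Googlebot next to a 406 for AhrefsBot or MJ12bot is the
    # selective blocking described above.
    response = requests.get(url, headers={"User-Agent": ua}, timeout=10)
    print(f"{name:10s} -> HTTP {response.status_code}")
```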
But a Different Story for Ahrefs
And a Different Story for MajesticSEO
We alerted both Ahrefs and MajesticSEO, and neither responded beyond a canned "we will look into it." We thought it important to let those dealing with link removal know to look even more carefully. Now, in August, three months on, both maintain the original response.
User-agents and how to run these tests
The user-agent, or user-agent string, is sent to the server along with any request. This allows the server to determine the best response to deliver, based on conditions set up by its developers. It appears in the case of Netwerker's servers that the response is to deny access to certain user-agents. Here is the browser-based way to run the test; a scripted version for checking links in bulk follows the list.
- We used the User-Agent Switcher extension for Chrome
- Next, determine the user-agent string you would like to check. (These can be found on various sites; one set of examples is at https://www.useragentstring.com/. In most cases, the owner of the crawler or browser will have a webpage associated with it, for example the Ahrefs bot.)
- Within the User-Agent Switcher extension, open the options panel and add the new user-agent string.
- Browse to the site you would like to check.
- Using the User-Agent Switcher, select the agent you would like to view the site as; the page will reload and you will be viewing it as the new user-agent.
- We used eSolutions, What's my info? to verify that the User-Agent Switcher was presenting the correct data to us.
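The browser workflow above is fine for spot checks. For a whole export of "lost" links, a short script saves time. This is a sketch under the same assumptions as the earlier snippet (hypothetical file name, documented-but-version-dependent bot strings); it flags pages that serve Googlebot normally while refusing a link tool's crawler:

```python
import csv
import requests

BOT_UAS = {
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "AhrefsBot": "Mozilla/5.0 (compatible; AhrefsBot/5.0; +http://ahrefs.com/robot/)",
}

# lost_links.csv (hypothetical name): one URL per row, e.g. a "lost links"
# export from Ahrefs or MajesticSEO.
with open("lost_links.csv", newline="") as f:
    urls = [row[0] for row in csv.reader(f) if row]

for url in urls:
    codes = {}
    for name, ua in BOT_UAS.items():
        try:
            r = requests.get(url, headers={"User-Agent": ua}, timeout=10)
            codes[name] = r.status_code
        except requests.RequestException as exc:
            codes[name] = type(exc).__name__
    # 200 for Googlebot but not for the link tool suggests selective blocking.
    if codes.get("Googlebot") == 200 and codes.get("AhrefsBot") != 200:
        print(f"Possible selective block: {url} {codes}")
```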
A final summary
If you talk with anyone who is known for link removal (think people like Ryan Kent of Vitopian, an expert in link cleanup), they will tell you to use every link report you can get your hands on to ensure you miss nothing, and they always include Google Webmaster Tools as an important source. Personally, while we always use GWT, early on I did not think it was important for anything other than checking whether we had missed something, because it consistently shows fewer links than the others and the links it does show usually appear in the other tools as well. My opinion has changed with this revelation.
Given that we gather data on clients early on, we had something to refer back to for the link cleanup. Today, if someone comes in and we have no history of their links, we must assume they have links from sites blocking the major link discovery tools, and we proceed with a heightened sense of caution. We will never again believe we have cleaned everything; we can only believe we have cleaned everything in GWT.
If various directories and other sites with a lot of outbound links start blocking link discovery tools because they "just don't want to hear any more removal requests," GWT just became your most important tool for catching the ones that block the tools. They would not want to block Google or Bing, for obvious reasons.
So, as you go forward and look at links for your own site and/or for clients, I suggest you check GWT to make sure there is nothing showing there that fails to show in the well-known link discovery tools.
Thanks for sharing Robert,
I've bumped into the same thing recently. I noticed that Ahrefs and Majestic were reporting lots of lost links - only to find, when I manually investigated, that the links were still there. Seems as if it wasn't just a one-off.
It's a good reason why you should never just trust headline metrics without validating them.
I think the biggest issue here is still the view that webmasters are being held responsible for links pointing to their site - but I'll resist the urge to start ranting about that.
The point is - if they are going to hold webmasters responsible - they must provide us with better tools to review/manage backlinks to our sites. Webmaster Tools should provide some kind of quality metric or warning if a link isn't the kind of link that Google wants.
Doug,
We share a lot of the same views. It is interesting to note that originally, with this client, I thought a competitor had done the linking to them because it was sooooo obvious. But when we finally got the records that were available from the previous people who stated they did SEO, they listed it as a directory they had gone to.... Sucks that it wasn't a competitor.
Interesting thought on G providing a metric. Since I like you, please do not hold your breath ;)
Best
I don't think that's ever going to happen. Google is more likely to move away from relying on links as a metric than they are to educate the masses on what links they deem "good" so that people can manipulate them. Google is in the game to make money on ads, not help webmasters rank their sites : )
Firstly, thank you for sharing your experience, Robert!
We've had similar issues with some of our Google Penguin & manual action campaigns; that's why we ended up disavowing links/domains that appear to be gone if they are deemed 'bad' links.
I'm not sure how manually you like to do your link analysis, but we always like to review them in the browser just to make sure.
Spotting the links on a page can be hard and take more time than is needed, as I'm sure everyone knows, so we recently made a free Chrome Extension to help speed up the process. If anyone is interested in this we currently have a beta version for people to test. https://www.spamflag.com/beta-manipulative-link-identification-chrome-extension/.
We use this combined with a User-Agent Switcher Chrome extension (such as https://chrome.google.com/webstore/detail/user-age...). We then set the user agent to Googlebot 2.1:
Mozilla/5.0 (compatible; Googlebot/2.1; +https://www.google.com/bot.html)
A user-agent list can be downloaded from https://techpatterns.com/downloads/firefox/useragent_switcher_agents.txt
If anyone wants to try it and give us some feedback so we can make it better we'd love to hear from you.
Nice tool Martin!
For those into Firefox there is a very handy plugin called Find That Link.
It can be combined with the User Agent Switcher for Firefox to check for links as Googlebot etc.
Flem can also be a big time-saver. It can open up multiple URLs on Firefox straight from a .txt or .csv file.
Modesto,
Between you and Martin these are some nice "shortcut" tools. Really appreciate the information.
Absolutely "spot on" Martin.
Any link you want removed should be disavowed... it is worryingly common for dead links to be resurrected for all manner of reasons... so common that we have a "rechecker" at rmoov to catch them and add them back into link removal campaigns.
Spamflag is excellent and has made manual analysis in chrome so much less tedious! Thanks to you and your team for building it.
- Sha
I have to agree with Sha on this. Good point Sha. Thanks
Hi Robert, Thanks for the post. Majestic calling. Our reply may have been a canned response, but we have very much been aware of this issue for a while (which is why we HAD a canned response!) :) - watch out for something awesome about this from us soon!
Actually, the first response was: "Your request has been solved?" then, "PR will call you." All I was seeking was a reasonable response from someone with knowledge of SEO. Bummer. I actually thought notifying you was a responsible thing to do. I know if we have an issue I am happy when my clients contact me about them. I do hope you have something awesome coming and I will look forward to it.
Best
Really? Can you please tell me the ticket number?
Yeah, I actually use Link Research Tools' Link Detox to scan for bad links, and I had a lot of suspect links on my first run. They suggested using other backlink sources, but I didn't use any at first. I was running these every month, and after a few months I didn't notice many new links being found that I hadn't either disavowed or cleared as "good links." However, after month 3 I decided to export that WMT link spreadsheet and uploaded it to LRT, and it definitely found some toxic domains that weren't on their list. So I would highly suggest that people doing disavows use any backlink report they have access to, including any old reports from link building services used in the past. I would never have guessed that toxic directories and link networks would start blocking sites like Ahrefs.
I think that is good advice.
In Netwerker, I couldn't help seeing the word twerk more than anything else.
Oh, I saw a lot of things in Netwerker...just can't post them here.
GWT as a backlink source is critical, as you mentioned; that's a good message for all webmasters. In regards to this netwerker.com and its weak attempt at being clever - what a waste of time. I mean, why even bother? They are exposing footprints all over the place (AdSense code) and didn't even think about how easily someone would find out. Meh, anyway, because of GWT there is no real way to hide links from webmasters, but you can do smarter things to confuse the big link tools (which I won't disclose ;).
Well put David, but here is what I think is the real issue: I think they just got tired of the emails. Think about how it all worked pre-Penguin. These guys could make a little money doing what they do, "helping" people get links. They were rarely bothered and (this is a guess) they probably didn't need a huge customer service staff. Now Penguin hits and they are getting a thousand emails a day (and that may not be an exaggeration). This wasn't even trying to be clever; I think it was really them throwing up their hands and screaming at the sky, "I can't take it any more!"
Thanks
Hey Robert,
Blocking and other behaviors exhibited by webmasters are fascinating to me...just when I think I have seen it all I find myself staring incredulously at yet another bizarre and convoluted "solution" that someone has dreamed up. I'm even more stunned by how much time and effort people are willing to expend on coming up with these techniques.
What many completely overlook is that where links no longer exist, allowing crawlers will enable tools to recognize their absence and stop further email from being sent.
The pity of it all is that we have been providing a 5-minute fix for publishers since early last year, but most seem so wedded to their angst they would rather spend significant time adding blocks or creating bogus redirects than take 5 minutes to verify a domain, add a note, and get on with life :(
Whether publishers want to remove links or not, whether the requests they receive are misguided or not, the Verified Publisher system is designed to help everyone. Even better, it requires little time, little effort and absolutely no negative energy to put in place :) (rmoov.com/rmoov-letter-to-webmasters.php)
We love our verified Publishers (rmoov.com/rmoov-webmaster-verified.php) because frankly, they "get it" and we think they deserve a break wherever we can find a way to give it to them.
-Sha
Did not know about verified Publishers. I shall check it out. Thanks Sha.
I kept getting linked to by Russian porn sites and nothing could be done. I finally got a robots.txt to block them, but now that's getting thousands of hits a month. Is that a good thing or a bad thing? The nice thing is my overall traffic has gone up and all those spammy sites are now gone. I'm just worried a large chunk of that traffic is not really traffic but that dang robots.txt.
Thanks for the response Greg. I am assuming that the robots.txt is blocking thousands of hits from the porn sites? If so, that is great. I don't understand what you mean by "...worried a large chunk of that traffic is not really traffic but ...robots.txt." robots.txt is an exclusion protocol that keeps a robot from crawling there. I would look at the traffic in analytics, specifically at referrers, organic, paid, direct, etc. If you look at your landing pages in GWMT you should be able to see where traffic is hitting. By using analytics you should be able to get some peace of mind as to the origin of the traffic.
This is a bit different than what I am speaking to in the post, but I hope it helps you out.
Thanks!
Really a very good post Robert. Mostly I use data from GWT, Ahrefs and OSE. From my experience I have seen that backlink data from WMT is enough to revoke a penalty. But I still go with all the data I get from different tools just to be sure. A lot of links shown on these tools are no longer live.
Data from WMT + combined data from 3rd-party tools + skill and experience = successful penalty recovery!
Now, if we could please have a Penguin update!
As a new webmaster who doesn't know stuff like you guys, I found this highly informative. Thanks for the post.
Thanks SampagutiasDating
I have to say, if you could stay with this, you are further along than I was when I first started out. BTW - I got my marketing start in the dating industry about 100 years ago.
Best
Interesting observations, but here is one that will make your head spin: some of these directories are blocking Googlebot, which technically makes it hard for Google to tell whether the links have been removed without a manual check by the quality team. I know John Mueller has touched on the topic of Google needing to be able to crawl these pages to confirm a link is gone....
If the links are still there and they are just blocking Googlebot from crawling the page to see if they are really gone, this matters more for sites hit with a manual penalty and not so much for Penguin...
We also found all sorts of strange behaviour after sending link removal requests; often the page we had asked to be removed was replaced with a redirect to hardcore porn or spyware/malware. But if you started messing with user agents you could often still find the content/links. I think it's partly them having fun with SEOs... which sucks if you are just trying to get links removed.
It would seem that if they were blocking Googlebot, it would negate the entire purpose of having the directory. Are they trying to rank on Bing, Baidu, etc. instead? On the redirects, the links still show and just redirect, correct?
The thing to remember is that many of these tools may be blocked for mundane reasons - as a webmaster you might see masses of visits from the tools with no payback for your own site, so you block them to save bandwidth. It's not, as Christopher said, "a bit sh#ty," just a case of saving your resources for the people who are likely to spend money with you, or bring you those people - something these link checking tools won't do.
And they might be blocked by robots.txt or even at the .htaccess level (the latter means that even if the bot ignores the robots.txt rule, it stays blocked and can't get onto the site).
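For anyone curious what those two kinds of blocks look like, here is a hedged sketch. The bot tokens are the tools' real crawler names, but the rules themselves are illustrative, not anyone's actual config:

```
# robots.txt - a polite request that compliant crawlers skip the site
User-agent: AhrefsBot
Disallow: /

User-agent: MJ12bot
Disallow: /
```

```
# .htaccess (Apache 2.2 syntax) - enforced even if a bot ignores robots.txt
SetEnvIfNoCase User-Agent "AhrefsBot" bad_bot
SetEnvIfNoCase User-Agent "MJ12bot" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```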
Hey Robert great post thanks!
It really bugs me that Google wants people to clean up their links etc., but I always find WMT never really shows you enough detail. That's if it's even working; I've had a client with a penalty where WMT wouldn't show any links other than those in the notice.
I know a lot more blackhat sites are blocking tools like OSE, Ahrefs and Majestic to stop people seeing their links (and stealing them).
I hope Google, with its talk about transparency, starts to show more of what it really sees in links; that would help link removal a lot more.
Thanks again.
Thanks Chris
We do not do as much link cleanup as people like Ryan Kent, and I have not seen a huge number of these occurrences; this was the first time I saw it like this. Actually, a staff member, Judy, caught it and showed it to our head of development, Aaron, who did the sleuthing.
I have seen only one of the Google transparency statements, from an individual, and assume it comes more from individuals at Google than from a corporate announcement. (LMK if otherwise, please.)
I am not a G hater and really see them as a business and they are public and by virtue of that they are forced to constantly play to "the street." That means if the stock is at $563 today, some analyst is touting it to $700 and some other one is shorting it to $400. It is a game that came as the result of wanting to monetize a property that was really making more money for others than for Larry and Sergey prior to them "wising up."
It would be nice if they were open to sharing things like links, etc., but is it really a reasonable move when you think about it? As SEO pros we are actually better off if they keep business as usual*. Think about something really beneficial for those who want to use it correctly, like authorship; I promise you that within a very short span of time, one half of the listings for ___ lawyer had some really bad picture of a lawyer (the blank is not for an expletive epithet ;). In other words, in competitive verticals, every time Google makes a change there are other SEOs, and we ourselves, trying to get someone "fully optimized." If I take a client who is paying for copywriting as a service and set them up with authorship (which in this case is reasonable on one level, as it is a work for hire in copyright parlance), why was I doing it? Solely to improve his rankings or CTR. So, Google does this stuff and, even if we do so with the whitest of hats on, we exploit it to assist our clients or ourselves, as it may be.
Yes, I too wish there was just a bit more actionable data from Google and a bit less of the generalities; I would not want to be Matt Cutts. I don't think they can change too much, though, as they would be exposed to gaming by some of the white hats and most of the darker ones. But hope springs eternal, Chris. All the best and, again, thanks.
*NOTE: Thank God I am not on one of the other SEO forums with famous people like Skibilly, willynilly, sillydill, ad nauseum or I would be cursed and crucified.
Thanks for the heads up! I had always assumed that GWT just hadn't updated itself to match what I was seeing from those other tools. After all, GWT doesn't work in real time, so I assumed that's where the number discrepancy came from. But your story makes it seem like GWT, even with its lag, might be the best bet.
I think it is, for all the reasons pointed out here. I don't think the discrepancy in total numbers comes from this, though; I think this is just secondary to it all. Typically, Google shows fewer links than tools like Ahrefs and others. My point was that if you use those (and many of us do), we still need to pay attention to GWT.
Best
Killer post. Did their most precious keyword remain? What was the explanation for the improvement? Has the disavow file been recognized yet by Google or are the links still showing in WMT as of today?
The explanation for the improvement is probably more than one factor. Someone earlier pointed out that there had been no Penguin update, and I did not mean to imply that one had taken place. We cleaned up and were able to remove a lot of links, but not on the scale of the problem. I believe that for some links and linking mechanisms, if you are being algorithmically affected and you fix the issue, you will see a change in ranking. But for disavows, until there is an update (let it be soon, please!), no change. As to whether they are showing in WMT: even if there had been an update, links that are disavowed will still show, but Google ignores them for purposes of site value or authority (ranking).
Thanks
Hi Robert,
Last year we had a Penguin penalty. Our agency submitted disavow file with links we wanted gone. However Google webmaster tools is still showing that some of the links (mainly directories) have over 1,600 links linking to our website. I questioned this with our agency and they responded saying that they’re in the disavow file so they no longer pose a threat. Would you agree?
Thank you for a good post on link removal and strategy. I am currently seeing consistent linking from a particular link index site, and will definitely dig deeper into the issue using some of your approach. This also emphasizes the need to invest in quality link-building operations, not link directory submissions.
Wow, that's sneaky. For a long time I have used Google WMT to get a big-picture link profile, as they still provide the most diverse set of domains. Ahrefs, Majestic, and OSE are catching up, though. The fact that they're blocking these user agents leads me to believe that sites like this won't be indexed by Google for very long - sending different content to different user agents is a direct violation of WM Guidelines, I think, even if it's just serving a 406. Thanks for sharing; this will be VERY useful for a link cleanup I'm about to begin.
Great writeup... thanks for this.
A very good case study to point out the need to not forget about G WMT when it comes to assessing back links - thanks Robert :-)
Thanks for sharing your experience with us.
Looks really good. Some of Google's updates are really fantastic, but Penguin, of course :D
Thanks for sharing this Robert
It's a little old, but it still amazes me how many people see these posts on Moz.
Where can I see more about this? It is new to me.
Thanks for the advice
You just made a clear explanation for what's happening right now. I could not figure out why Google Webmaster Tools behaves this way.
Hi, this is a great post. I'm wondering how I can get Moz to do a review of my site, Learn Debt? Can you please advise if you still accept new clients and, if so, how much you charge to review new sites? Thanks!
Hi! We haven't done consulting since 2009, but we do have a list of recommended companies at https://moz.com/community/recommended.
Hey Robert,
Great discovery! If negative SEO ever proves successful and becomes widespread, its practitioners will likely use these masking techniques, so it's one to keep an eye on. Just wondering why you waited for the domain to show in Webmaster Tools before disavowing it? Why not just put the domain into your disavow file before it showed up?
Thanks,
ROI Marketing
Good question ROI,
When the client came on board (March/April 2013), they had previously had an issue with Google and their Places account (roughly a year before 2011). Given they had no manual penalty and really bad linking tactics had been used, I felt that discretion was the better part of valor. In other words, I did not want to do anything that might draw attention to them before we had begun some of our changes, for fear that someone at Google would want to take a "human look" at the site. That could easily have resulted in a manual penalty, given the state of things.
Thanks,
Oh boy, did I spend some serious time with my MajesticSEO account and GWT trying to fix a ban; it took a lot of pain, headaches, coffee, tea and food to try and solve it. My beef with Majestic is that it just doesn't seem to be up to date with everything, so I'll disavow a link and check Majestic daily, but at times it'll still be there. Or worse, links from the disavowed domain will increase - somehow the site I was managing got involved with a Thai directory network - delete one link, 10 get added, sort of thing.
Majestic is a great tool, but you are looking at this a bit wrong. A disavowal does not remove anything, and Majestic, Moz (OSE), Ahrefs, etc. cannot control who adds or deletes a link. In any of these you can find new or lost links - I know for a fact you can in Majestic. So, for example, with your Thai directory you could simply monitor new links to see what they are adding.
A disavowal affects ONLY Google, and it does not remove the link even from GWMT; it simply tells Google, "Don't associate this with me for positive or negative credit to my site." The link will remain, though.
Hope that helps,
Robert
Robert is quite right. The best that Majestic (or Moz or Ahrefs, or even Google for that matter) can do is to try and map the web, then, if we are lucky, try to understand it. One way to think of how it is mapped is to imagine Google Street View cars running around the web, as if links were roads. Any car can travel, although some roads block some cars. Where Majestic gets blocked, our map stops. In other situations, Google may get blocked, although Robert has a very valid point in his post: there are some limitations in using a map of the web other than Google's to try and deal with Penguin problems. There are also problems just using Google - especially when sites have more than 100,000 links (which is the maximum Google can show, I believe). For this reason, using multiple data sources has some advantages.
Also - as Robert points out - just because a link is disavowed, or unseen by Moz, Majestic or Ahrefs does not mean it does not exist. The disavow file is a "private arrangement" between you and Google and third party tools are not privy to this information.
(Typed on a phone, please excuse any typos)
Dixon
A lot of what is involved in link cleansing and removal is pure common sense, which is what you need for a successful career in SEO.
Great article, some interesting takeaways.
Nice post, Robert. Just one typo after the "A final summary" heading. GWMT and GWT are the same, right?
Yes, thanks. If you knew how many times I edited this, you would say... and you missed this!?
Best
Not to mention 50 lashes with a wet noodle for the YouMoz editor who let that one slip by. Fixing now!
Brilliant work. I am impressed with the fact you guys took your client's website so seriously. Thanks for laying out your experience this way. Really enjoyed reading it and felt I got a good idea of how you guys think and work. Great to see this kind of integrity in SEO.
Thank you David, we really try to take all of our clients this seriously. Best.
Nice one Robert. I always use Google WMT and Ahrefs primarily now for link review and cleanup - really interesting (and a bit sh#ty) find that a website was blocking the crawlers like that.
I've been noticing a lot of 'deleted' links myself. This one kind of slapped me upside the head. I didn't even think the benevolent easyseolinkdirectory4ulol.com would be so sneaky. Nice catch by your team. Thanks for putting this together.
Don't forget that the MajesticSEO bot, Ahrefs bot, and Moz bot (yes yes!) can all be blocked by robots.txt, so the base IS Google Webmaster Tools, not the other tools, for success in link cleanup and lifting a penalty. Often I see links missed by Majestic, Ahrefs, etc. but not by GWT. Maybe GWT shows only a sample, but combined with the other tools (and knowledge, experience, etc.) it will give you a positive answer about a manual action being revoked or an algorithmic penalty lifting.
Well, to me you have to use a lot of tools so that you are sure you take care of as much as possible for a client. While robots.txt could be used, the robot does not have to obey it and I think it might be more obvious than what we saw. But, it is a valid point.
Very interesting. I will give it a try. Thanks.
Thanks for telling us about your experience, Robert! I recently wrote a tutorial on Google Webmaster Tools on my blog because I think it is a key tool for link building (and cleanup). No surprise, given it is the search engine's own tool... :-)
Great article Robert! I have used techniques like this during link discovery and cleanups; it's nice to see that I was heading in the right direction.
This is fascinating, thank you Robert. We've been under the belief that a competitor has been engaging in some kind of link-cloaking shenanigans and this helps lend credence to that, as well as provide some ideas on how to find out what exactly is going on.
Great post, thanks Robert. I've always used GWMT but only to get more data; I find there's usually a few extra ones in there worth knowing about. I collate data in spreadsheets as well as using LinkRisk to get an idea of the bigger picture.
Also, when I find unnatural links like that network, whether manually or from a tool, I disavow straight away without waiting for them to show in GWMT. This is purely because not all links show in GWMT even when Google knows about them (as we know from manual penalty example links not being listed in GWMT link data). I always use the disavow as a precaution in the first instance, and if I can get the links removed/nofollowed too, then all the better.
I agree Robert and I am happy that I found this information. Thanks a lot for sharing.
Great information for people who are starting to build out new sites!!
Are you sure you were penalized by Penguin?
How do you start popping back up in two weeks then get full recovery before the next Penguin update?
I think that by taking some of the bad links off, we improved what was happening with the site. We did not get a full recovery; we are still waiting like everyone else. What we got was an improvement, and I did not mean to imply it was a full recovery.
What is that redirect tool/plugin you're using?
I am not sure I understand what you mean by redirect tool or plugin?
It looks like you are using the Ayima Chrome redirect tool (https://www.ayima.com/seo-knowledge/redirect-checker.html) but for some reason it looked a bit different...didn't know if there was a new tool out there.
Blocked bots to block inquiries. Simple.
Oooh, it's great, but Google Webmaster Tools is also good.
I agree with this.