Recently, Moz contributor Scott Wyden, a photographer in New Jersey, received a warning in his Google Webmaster Tools about some links that violated Google's Quality Guidelines. Many, many site owners have received warnings like this, and while some are helpful hints, many (like Scott's) include sites and links that clearly do not violate the guidelines Google has published.
Here's a screenshot of Scott's reconsideration request:
(note that the red text was added by Scott as a reminder to himself)
As founder, board member, and majority shareholder of Moz, which owns Moz.com (of which YouMoz is a part), I'm here to tell Google that Scott's link from the YouMoz post was absolutely editorial. Our content team reviews every YouMoz submission. We reject the vast majority of them. We publish only those that are of value and interest to our community. And we check every frickin' link.
Scott's link, ironically, came from this post about Building Relationships, Not Links. It's a good post with helpful information, good examples, and a message I strongly support. I also, absolutely, support Scott's pointing a link back to the Photography SEO community and to his page listing business books for photographers (this link was recently removed from the post at Scott's request). Note that "Photography SEO community" isn't just a descriptive name; it's also the official brand name of the site. In both cases, Scott linked the way I believe content creators should on the web: with descriptive anchor text that helps inform a reader what they're going to find on that page. In this case, it may overlap with keywords Scott's targeting for SEO, but I find it ridiculous to hurt usability in the name of tiptoeing around Google's potential overenforcement. That's a one-way ticket to a truly inorganic, Google-shaped web.
If Google doesn't want to count those links, that's their business (though I'd argue they're losing out on a helpful link that improves the link graph and the web overall). What's not OK is Google's misrepresentation of Moz's link as "inorganic" and "in violation of our quality guidelines" in their Webmaster Tools.
I really wish YouMoz were an outlier. Sadly, I've been seeing more and more of these frustratingly misleading warnings from Google Webmaster Tools.
Several months ago, Jen Lopez, Moz's director of community, had an email conversation with Google's Head of Webspam, Matt Cutts. Matt granted us permission to publish portions of that discussion, which you can see below:
Jen Lopez: Hey Matt,
I made the mistake of emailing you while you weren't answering outside emails for 30 days. :D I wanted to bring this up again though because we have a question going on in Q&A right now about the topic. People are worried that they can't guest post on Moz: https://moz.com/community/q/could-posting-on-youmoz-get-your-penalized-for-guest-blogging because they'll get penalized. I was curious if you'd like to jump in and respond? Or give your thoughts on the topic?
Thanks!
Matt Cutts: Hey, the short answer is that if a site A links to spammy sites, that can affect site A's reputation. That shouldn't be a shock--I think we've talked about the hazards of linking to bad neighborhoods for a decade or so.
That said, with the specific instance of Moz.com, for the most part it's an example of a site that does good due diligence, so on average Moz.com is linking to non-problematic sites. If Moz were to lower its quality standards then that could eventually affect Moz's reputation.
The factors that make things safer are the commonsense things you'd expect, e.g. adding a nofollow will eliminate the linking issue completely. Short of that, keyword rich anchortext is higher risk than navigational anchortext like a person or site's name, and so on.
Jen, in particular, has been a champion of high standards and non-spammy guest publishing, and I'm very appreciative to Matt for the thoughtful reply (which matches our beliefs). Her talk at SMX Sydney—Guest Blogging Isn't Dead, But Blogging Just for Links Is—and her post—Time for Guest Blogging With a Purpose—help explain Moz's position on the subject (one I believe Google shares).
I can promise that our quality standards are only going up (you can read Keri's post on YouMoz policies to get a sense of how seriously we take our publishing), that Scott's link in particular was entirely editorial, organic, and intentional, and that we take great steps to ensure that all of our authors and links are carefully vetted.
We'd love if Google's webmaster review team used the same care when reviewing and calling out links in Webmaster Tools. It would help make the web (and Google's search engine) a better place.
OK...I've been watching this unfold and have not been able to get to a computer until now to comment.
It is not the SOURCE of a link that makes it unnatural. It's really two things:
-Intent to manipulate the Google search results
-Whether it is part of a large pattern of similar links
What I am saying by this is that Moz was not the cause of this unnatural link; the pattern was the problem. It looks like Scott has done a good job with link cleanup: when you look at his link profile on OSE or Ahrefs, it's not horrendous. But if you look at the link profile on the historic tab of Majestic, you can see that there are, for example, 59 domains linking to him with the anchor text "New Jersey Photographer". I didn't take the time to dig into the quality of those links, but it's unlikely that he received 59 naturally earned links with this anchor text.
My point is that the webspam team didn't go through each link and say, "Ah, this is a link from Moz. This is bad." Instead, they picked out the patterns that they could see and chose some examples from there. Perhaps guest posting on a large scale for links was a pattern. Perhaps excessive link exchanges were a pattern. Again, I didn't dig into the causes, but the key is that the problem here lies in the patterns of the links that are pointing to his site.
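The pattern check described above (many distinct referring domains sharing one exact-match anchor) is easy to approximate from an exported link list. A minimal sketch, assuming a simple list of (domain, anchor) pairs such as you might export from Majestic or OSE; the threshold is an arbitrary illustration, not anything Google has published:

```python
from collections import defaultdict

def suspicious_anchors(links, min_domains=10):
    """Group backlinks by anchor text and flag anchors that repeat
    across an implausibly large number of distinct referring domains."""
    domains_by_anchor = defaultdict(set)
    for domain, anchor in links:
        domains_by_anchor[anchor.strip().lower()].add(domain)
    return {anchor: len(domains)
            for anchor, domains in domains_by_anchor.items()
            if len(domains) >= min_domains}

# 59 domains all using the same commercial anchor looks unnatural,
# while one-off brand or name anchors do not trip the threshold.
links = [(f"site{i}.example", "New Jersey Photographer") for i in range(59)]
links += [("moz.com", "photography seo community"),
          ("squidoo.example", "photographer")]
print(suspicious_anchors(links))  # {'new jersey photographer': 59}
```

The point of a script like this isn't to replicate Google's judgment, only to surface which anchors dominate a profile before a human looks at the individual links.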
I have had several occasions where Google has given me an example link on a failed reconsideration request where I KNOW the link is natural. I'm not talking about a keyword anchored link in a guest post, but rather, a truly natural mention where the person mentioning my client happened to use a keyword. The first time it happened I was enraged. How Could Google call THIS an unnatural link? But, when I looked closer, there were other links that were either purchased or self made that fit the same pattern. What we do in cases like this is keep the ones that we know for certain were natural mentions but we are sure to remove or disavow the self made ones. In every case we have been able to get our penalty lifted despite keeping these example links.
I've yet to see a site that received an unnatural links penalty that was unwarranted.
Should we all stop posting on Moz and Youmoz? Nope. I post regularly, and yes, I link to my site when I do so. But, the link is always one that makes sense for users and I almost always get crazy referral traffic from these links. In the article in question, I wonder how many people actually clicked on the link to Scott's site? Did it make sense to be there for readers? Or was it only there so he could get a followed link from a high PR site?
Interesting that you could disavow/clean up links with a similar pattern to the one Google points out, but actually keep the specific link they point to. The email implies that the links themselves are unnatural, rather than the pattern they fall into. Worth not disavowing the examples they give initially, and instead looking for others which fit that pattern. Got my brain whirring on that one. Good response, Marie.
The only thing Google can use is a pattern. They can't evaluate every source on the web so they identify trends in the larger data. When they find a trend that improves quality as a whole, they apply it. So long as their algorithm adjustment improves rankings more than it harms them, they will implement it.
False positives will happen, which is the main reason why technical SEO will be very important for a long time.
Agreed - but keep in mind: this is a reconsideration response that is handled by a real person. One would think a Moz link in the sample would be noticed and discarded/replaced.
Is this a sign of an oversight by the reviewer or a deliberate jab? I think probably a human error that is forgivable. It's easy to see how it might be missed by a reviewer and/or defended as a pattern of bad behaviour should they stand by the sample link.
Hey Chase! We know that a real person handled the reconsideration request, and we'd hope that everyone on the Google team knows about Moz (formerly SEOmoz) and its guidelines; we can't just wave that away by calling it "human error".
You'd think that a human reviewer at Google would know what Moz is and the standards the site has set for itself. I can understand a few false positives when a natural link looks like it might have been manipulated in some way because it follows a certain pattern, but a site like Moz should be well known to the reviewers at Google. It would be like calling out a link from Huffington Post.
Exactly. That one link from Moz didn't cause the penalty; it was persistent guest blogging - Moz was just part of the pattern of behaviour. He could possibly challenge that *one*, but as you say, they have at least 59 more examples to go back with!
https://twitter.com/ChrisLDyson/status/491867330868441088
Hi Marie - if it is the case that Google Webmaster Tools examples aren't inorganic, non-editorial links that violate their guidelines, but are, rather, examples that might point a site owner to a pattern of other links, then they should be clear and transparent about that. As it is currently labeled in their platform, what you're saying (which sounds to me like it's accurate) is not what they're saying, and thus, confusion and frustration become byproducts. If it's a simple text change to the language to create transparency about how the process actually works, then I'd like to nudge Google to make that change.
You and me both, Rand. There are so many parts of the manual penalty process that are confusing and poorly worded. For example, many partial action notices tell site owners that they don't need to do anything because Google is taking targeted action on the bad links. But, in the same message they tell you that you should clean up the problem and then file for reconsideration. Which is it? Do nothing? Or spend thousands of dollars doing a cleanup?
Another example is the reconsideration request response that instead of saying "manual spam action revoked" or "we still see unnatural links" says, "We've processed your request". Now that we have the manual spam actions viewer, I can see that this is the message that gets sent when they downgrade a sitewide to a partial. But, it's still confusing.
Google made a step forward in communicating with webmasters when they implemented the manual actions viewer, but they still have a lot of room for improvement!
Not to mention that your reconsideration request can be denied and, after doing nothing, your site-wide penalty is gone after three weeks. This has happened to me three times, and three weeks was the fastest recovery. Talk about inconsistency and confusing answers.
I suspect all manual actions have a timeout, but why aren't they renewed when a reconsideration request is denied? Shouldn't the same rules apply to both scenarios?
The confusion from unclear guidelines seems to devolve into a "prove you're not a witch" scenario. If you deny the accusation (penalty), that might be an admission of guilt (spammy practices), but then again, if you say nothing you're only stoking the fire for your accusers (Google's webspam team).
Is Google creating a Salem-esque culture by saying it's looking for patterns, which would seem to suggest some rhyme or reason, but not being open about what those patterns might be or what sites may trigger them?
"There are so many parts of the manual penalty process that are confusing and poorly worded..." Yup - fallible on so many fronts.
I had a client get a GWT response citing three "bad" links, where one was quite solid. I sent the disavow without that specific link and was entirely cleared. It's like shooting in the dark. Really makes you wonder how many internal resources and how much QA Google is actually putting into this process.
Matt Cutts posted an article about the decay of guest blogging for SEO, aiming to stop people from using guest blogging as a link building strategy in 2014 as it grew spammier over time.
Maybe that blog post didn't get all webmasters - or even a handful of them - to steer away from the strategy, which is why Google needed to enforce it with another move, to show they are serious about the warning.
Moz, as we all know, is a reputable website where webmasters hang out, share their thoughts, and learn. That made Moz a good target for Google to use as a channel to reach webmasters with their "stop guest blogging" campaign. Google is targeting the masses; it just happens that Moz is a great way to send their message to everyone.
Moz is just a casualty of their campaign.
Not really.
You said yourself, in your opening paragraph, that guest blogging for links is what Google is taking action on and is where the problem lies.
Moz manually screens every piece of content on YouMoz before it makes it to publication, and only the best content makes it: content that is really useful and valuable, and that shows significant research with facts, findings, and results. It ticks all the boxes that any content on the web should.
I think that the destination of the link is also important and that when you are linking out into the SEO space a lot of the time you are hitting - or just one hop away from hitting - bad neighborhoods.
Moz has lots of links going out. Pay us 200 moz points and we remove the nofollow.
Perhaps this policy should be nofollow all of the time or none of the time.
We do have a lot of links. And I ban accounts on a daily basis that are spamming us. I haven't examined every profile link, but generally if someone is a spammer, we figure it out long before they hit the 200 moz points.
I think that if the link is always nofollow then some of the spam will be eliminated. Certainly not all.. but some.
Or, why not just publish the URL without the hyperlink?
All links in blog comments and Q&A are nofollow, and we still get plenty of spammers.
It does take a bit of effort to get to 200 MozPoints, especially for people who don't have Q&A access. We will take a fresh look at how good/bad it looks for people that have 200 MozPoints and what types of links they have.
To me, this has a lot to do with Moz's history. We've always wanted to support and recognize participation. When nofollow came out, it was a way to say "I don't trust the validity of this link and can't editorially vouch for it." We even had some conversations with folks at the search engines (Google, Yahoo!, and, at the time, MSN Search) about whether we should be nofollowing all our community links. We came to the conclusion that there needed to be an editorial bar, and that's how Mozpoints came about.
Today, if someone reaches that Mozpoints bar, we've seen them, we know who they are, and we've liked/appreciated their contributions to our community. To me, that's worthy of an editorial endorsement, and the link is a nicely scalable way to offer that.
What I do think we can/should consider is having a decay effect, such that if someone reaches 200 Mozpoints, then drops off, we should remove that live link after a period. That helps us to continue validating that there's participation and worth in the editorial endorsement.
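That decay idea could be sketched in a few lines. The 200-point bar is Moz's real threshold; the inactivity window and function names below are hypothetical, since no period was specified:

```python
from datetime import date, timedelta

POINT_THRESHOLD = 200                    # the real MozPoints bar for a followed link
INACTIVITY_WINDOW = timedelta(days=365)  # hypothetical; no actual window was specified

def profile_link_followed(points, last_contribution, today):
    """A profile link stays followed only while the member holds the
    point threshold AND has contributed within the inactivity window."""
    if points < POINT_THRESHOLD:
        return False
    return (today - last_contribution) <= INACTIVITY_WINDOW
```

Under this sketch, a member with 250 points who last contributed five months ago keeps the followed link, while one who has been silent for two years drops back to a plain (or nofollowed) link until they participate again.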
I like the decay effect idea. It is recognizing active members.
"Legacy" links could be going to sites that are no longer cared for or not owned by the same person or that have simply changed focus.
I think 200 MozPoints cannot be the only benchmark for awarding someone a "followed" link from your website. What also needs to be looked at is whether the link is going to a good site or to one with questionable credentials. Moz should have a policy on the kinds of sites they want to link to, regardless of the 200 MozPoints milestone. This might discourage users who are only out there to get a followed link to web properties that look questionable in Google's eyes.
There is no problem if links are nofollowed, especially when spammers hit blog comments; a nofollowed link shouldn't impact that page at all.
Here are some more specifics about our quality standards for YouMoz. I've been leading a project to develop an objective scoring system for posts, using a rubric grading system.
There are seven main content attributes we look at for a post:
Each of these attributes can receive a score from 0 to 2 points, for a possible total of 14 points per post. Examples of 0-point, 1-point, and 2-point scores are included for each attribute. The total points for the post help determine whether the post is to be declined, returned to the author with suggested revisions, or published with just minor editing and formatting improvements.
The rubric is available at https://docs.google.com/spreadsheet/ccc?key=0Amxjelq5QCTndGk1Zll3LWdLd1g2Smx6TDVBLUpraEE&usp=sharing and the team would love to hear your feedback.
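As a rough sketch of how such a rubric tallies up: the seven-attribute, 0-to-2-point structure comes from the description above, but the decision cutoffs below are purely illustrative, since the real thresholds aren't stated:

```python
def grade_submission(scores, publish_cutoff=11, revise_cutoff=7):
    """Seven attributes, each scored 0-2, for a 14-point maximum.
    The cutoff values are illustrative, not Moz's actual thresholds."""
    if len(scores) != 7 or any(s not in (0, 1, 2) for s in scores):
        raise ValueError("expected seven scores of 0, 1, or 2")
    total = sum(scores)
    if total >= publish_cutoff:
        return "publish"
    if total >= revise_cutoff:
        return "return for revisions"
    return "decline"

print(grade_submission([2, 2, 2, 2, 2, 1, 1]))  # publish (12/14)
```

The appeal of a rubric like this is that two editors scoring the same post independently should land on the same decision, which keeps the bar objective as the team grows.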
It's still a little early in the morning in Seattle, so I'm going to try for a bit more sleep, and will respond to any feedback after I've had a chance to grab some coffee.
Hi Keri. Cheers for the info. Out of curiosity, what did you score the YouMoz post in question?
The rubric is still being internally tested, so Scott's post came through long before it was developed. It would have received a high score, and would still be published today with these standards.
One huge issue we have been seeing recently is that the Search Quality Raters who are checking reconsideration requests clearly are not doing an in-depth job. It's probably evident in this case as well; anyone can see Moz is a decent-quality site.
We recently received a reply on one specific reconsideration where the three links pulled up by the search quality rater were actually all DEAD PAGES, and after checking on the Wayback Machine, these pages had been dead for a while; they weren't even cached. It seems they are cross-checking data, spending only a matter of seconds or minutes on analysis, and moving on to the next one. Google needs to pick up its game, in my eyes, in terms of quality and process on reconsideration requests.
I second your thoughts. It seems the same to me too: they probably skim the links at random and provide them as samples without analyzing them. Google needs quality on its own end instead of demanding it from us.
Either they should make changes to the manual penalty process or change the quality raters.
Well, here is a graph from Brian White (Google Search Quality) at SMX showing Google reconsideration requests growing from 2006 to 2012. I would love to see the last two years on this graph; my guess is that it is growing like crazy - https://d2v4zi8pl64nxt.cloudfront.net/1366915192_e1...
I do not blame the search quality raters who work at Google 100%; they are probably checking thousands of sites a week, hence the limited time to review each one. Yet Google has created this whole mess of manual actions for idiotic things, such as the whole My Blog Guest fiasco, and the flow-on effect falls on business owners who have to deal with it.
I had a client with exactly the same issue. What made it worse, there were at least two examples of links which not only came from dead pages but went to a 404 error on my client's site - a page that was deleted long before I started cleaning their links. Perhaps not related, but as usual, the Google WMT backlink data for that client was horribly out of date!
The worst is when the domain has expired but the links are still in the data for months.
James - I have had that so many times, and it is the reason you need to have as much historical data at your disposal as possible, whether it's old WMT reports, Majestic's historical data, or link building reports from previous agencies.
I have had links that were dead for over 6 months and often pushing back on them does work.
nice to read
Can you say more about which part was nice to read? That's an unusual response for the content of this post and the comments.
@ChrisDyson Yeah, agreed: you really need to look at all types of data (GWT, Ahrefs, Majestic historic, other data, old reports from spammer agency X, etc.). Now, with recons, we hit Google with everything: server non-responses, dead pages, all the goodies and more; and we do ;) don't worry. What I am saying is that if it is a "manual" review, you'd think they would manually check live vs. dead pages. Surely manual reviewers follow some type of guidelines when they do their work?
Yep. You know those sometimes several years old links that show up in WMT? I think the manual raters are sometimes looking at very out of date databases.
I agree with you. I think it is their scalability problem that forces them to just skim through the links and randomly grab some that fit a "pattern" they use to identify manipulative links.
Their so-called manual raters are always in a rush. Maybe they get their salary on a "quota" basis. :D
Man, they need to increase their quality of work.
I want to clarify a couple things. I do NOT own the Photographer's SEO Community - I am just an employee of the company who does. Part of my job is interacting with the community on a daily basis, offering advice, writing articles on the site, etc.
In fact, all of the broad recommendations I received from Google were for my personal site, scottwyden.com, and NOT for the Photographer's SEO Community. As I said, the SEO community isn't even owned by me and is NOT under my Webmaster Tools account at all. So really, everyone discussing the SEO community link is, in my opinion, discussing something that is not part of the penalty in question. The only link in question is the one to my personal site, since that is the site my Webmaster Tools account covers.
With that said - Rand, thank you for taking the time to write and share this. I have been so frustrated with the lack of help from Google getting this resolved, but I'm hoping it's coming to an end and the penalty will soon go away.
From all of my efforts trying to get this resolved, I've come to a few conclusions:
Scott, was going to do fresh comment, but some of this relates to what you might answer, so doing as a reply.
Let me start by saying I'd be as annoyed as Rand is if Google started telling people that Search Engine Land was a source of "inorganic" links. We have contributors; we take care to edit and be selective in what we allow. And ultimately, it's our site -- we'll decide what we think makes sense to have as links and how they should appear.
That said, Rand's title suggests Google is saying that Moz links violate guidelines. They clearly do not. The fact a link comes from Moz isn't an issue in this. Nor is it fair to say that Google wants sites like Moz to nofollow all guest links.
If Moz was the issue, Moz would have gotten a penalty or seen some decline in its rankings, in the way that some guest blog networks have. That hasn't happened to Moz.
More important, the notice you got says nothing about Moz being bad. In fact, if it were a Moz thing, people who guest post on Moz would be inundated with such emails. Moz has had hundreds of guest writers over the years, but there's no sudden storm of such notices.
The problem is with your site, scottwyden.com. Google doesn't like it, in particular because there seems to be a pattern, from what Google sees, of unnatural linking.
The 3 examples you received clearly aren't all the links concerning Google. It's easy to take 2 of them and say Google is hurting you because of scraper sites and is off-the-mark with Moz.
Coming into this from afar, curious and concerned about what might be happening, I looked at your original article. The link that was the issue has now been removed, as Rand noted. It was part of the paragraph where you wrote:
"For example, I wrote an article sharing the best business books for photographers. As you can see, my article is showing up second in the Google Alert preview."
And in that "best business books for photographers" was the link.
That's kind of odd. I mean, rather than make "article" the link -- which is the logical word to link to -- you went for the description, a link that was keyword-rich.
Why? I know why I'd do it -- I think like an SEO. In fact, I'll write my own articles and link back within our own site to supporting articles and make use of keyword-rich anchors, though it always has to make sense to the reader why.
Of course, I'm no fan that thinking like an SEO means you have to be held to a higher standard by Google, that it thinks "well, you should have known better." And if this was the only example of you doing guest posts, I think you'd have escaped that.
But I'm guessing you're writing in places other than Moz. I took a very fast spin at Open Site Explorer. I see a Squidoo profile where you link to yourself with the word "photographer" on its own, which feels kind of odd. I see a Social Media Today article where, in the bio, you describe yourself as a New Jersey Photographer -- with those three words as the link. If you've been doing a lot of that, it might -- along with photographers linking to you naturally -- have generated a suspicious pattern, getting a human to take a closer look.
Now flip things around. If you were at Google, and you were asked to review links pointing at scottwyden.com because something triggered a manual review, would you have looked at the link in the Moz article and thought all was fine? Or might it seem odd, especially if it's coupled with other things.
I'm absolutely NOT saying that you've been spamming Google. I think you might have, however, been doing guest blogging with an SEO's eye toward keyword-rich anchors. Which wasn't explicitly a Google crime until fairly recently.
From that, my takeaways are:
1) Moz isn't somehow less trusted now by Google
2) Guest bloggers who are writing for exposure, rather than links, still don't need to panic
3) Google isn't requiring guest blog posts to use nofollow (otherwise, again, Moz would have gotten a penalty)
4) SEOs who are blogging probably have to keep fighting all the natural best practices of the past to use keyword-rich anchors
5) SEOs who blogged in the past with keyword-rich anchors may have that catching up with them. Which sucks: being punished after the fact for something Google now dislikes.
I hope you get it all sorted out. I'm sure someone from Google will take a closer look after the attention this raises. I just don't feel people need to panic that Moz or any reputable site can't have links. We can.
Sadly, we also have to keep walking along the insane lines Google keeps painting. Sometimes it's like those light cycles from Tron, where you try to avoid the walls, but they keep coming close to you.
Danny,
You have nailed it here; especially beginning at "And if this was the only example of you doing guest posts, I think you'd have escaped that." and, including, "Social Media Today article where in the bio, you describe yourself as a New Jersey Photographer -- with those three words as the link".
The key issue behind the Moz link to Scott's photography site is that Moz, and the topics/community it is associated with (in Google's perception), is not photography. Topical relevancy of interlinking between sites has always been important to an organic link profile, as far as I know, and please correct me if this is wrong.
Granted Google will let you stray from the community/topic of your website; however, there is some "within reason" threshold that they keep you to and when you break that threshold the alarm sounds, red flags are thrown and unnatural linking messages are released.
This is not the first time I have seen this scenario play out, and it has been happening more often as of late. Topical relevancy of inbound links to your website is important to Google, and when you stray too far from what is relevant, it can cause exactly what Scott is experiencing.
I don't disagree with your comment, just want to respond briefly to this:
"For example, I wrote an article sharing the best business books for photographers. As you can see, my article is showing up second in the Google Alert preview."
And in that "best business books for photographers" was the link.
That's kind of odd. I mean, rather than make "article" the link -- which is the logical word to link to -- you went for the description, a link that was keyword-rich.
I truly think it's more logical to link to "best business books for photographers" in almost every instance -- links are like boldface or italics in that they draw the eye, and using specific anchor text makes text more scannable. Using generic anchor text like "article" or "click here" makes the reader do a little more work. Yes, I know SEOs have a history of abusing anchor text and Google is pissed about it, but that doesn't mean we have to take Google's position that all anchor text links are evil. I still think they are more user-friendly. This is why Google used anchor text in the algorithm in the first place.
Exactly what Elisa said. It's unfair to read intent into a link, which is what Google does. "You linked it that way to game our system." Bull. If I were writing that, I would want people to scan the article and if they had an interest in photography, the link could stand out and they might click on it.
The alternative, I guess, is to link "article" and bold "best business books for photographers" -- but that really is thinking like an SEO instead of building your links for the user.
I have to agree with Elisa and Colby. That is the phrase I would have linked because it is what would tell my reader was at that link. Linking the word "article" gives them no clue why they should care about what is there.
It's totally logical and helpful to link to content in a way that describes it best for readers. In the example above, linking to article would have worked. Linking to, "my article, TITLE OF ARTICLE," would have worked, with the link around the title. Even what was done would have worked IF not for what probably seemed to Google as unnatural linking elsewhere.
I'm all for linking titles, but titles are often keyword-rich too.
You're right, Elisa - but that's because the purpose of a title is to describe what the reader will find when he gets into the article. Which is the same purpose anchor text is supposed to serve.
It disgusts me that Google is - intentionally or not - encouraging webmasters to reduce the usability of our sites in order to lessen the risk of some ill-thought-out penalty.
It's all about context, and in this particular context, it looks spammy and unnatural to link the text 'best business books for photographers' to the article.
The context was to briefly mention a website in passing. I.e - [this article] about [this subject]. I agree with Danny, had the paragraph been edited accordingly (like Danny's example above), then there would have been additional scope to use a descriptive anchor but as it was, using the word 'article' as the anchor probably would have made the most sense.
Exactly the same scenario applies in how I just linked to Danny's comment here. It wouldn't have made sense to use a different anchor than 'example', the passing term I am using to reference something.
Definitely working on it. It's a stressful thing for sure. It would be useful if Google said straight up "this is what you need to look at" but of course that is unlikely to happen.
Yes, agreed entirely there. You've been given some sample links, which is better than in the past, but if some human at Google had said "hey Scott, look at these five or ten things. they looked weird to me, this is why, this is what I thought was going on and that's why I gave you a penalty. but if you clean this stuff up, you'll be back." That would have been much more helpful.
Spot on and well stated, Danny. Nearly everything Google looks at seems to be framed in terms of thresholds and patterns, and in my opinion, Scott's site seems to have crossed a threshold on the anchor text front. And I completely agree that the link's origin on Moz had nothing to do with either the penalty or the sample provided.
I do quite a few link profile cleanups for clients and MANY webmasters take it personally when I suggest that we need a link from their site removed or nofollowed. Pointing out the thresholds and patterns issue tends to defuse a lot of their concern (as well as resulting in more cooperative responses).
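To make the "thresholds and patterns" point concrete, here's a minimal, hypothetical Python sketch of the kind of anchor-text distribution check many of us run during a link profile cleanup. The function names and the 25% threshold are my own invention for illustration, not anything Google has published:

```python
from collections import Counter

def anchor_ratios(backlinks):
    """Given (anchor_text, url) pairs, return each anchor's share of the profile."""
    counts = Counter(anchor.lower().strip() for anchor, _ in backlinks)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

def flag_overused(backlinks, threshold=0.25):
    """Flag anchors making up more than `threshold` of all links --
    a crude stand-in for the kind of pattern a reviewer might notice."""
    return [a for a, share in anchor_ratios(backlinks).items() if share > threshold]

# A made-up profile: one keyword-rich anchor dominates a mix of brand/URL/generic anchors
profile = (
    [("best business books for photographers", "page1")] * 6
    + [("scottwyden.com", "page2")] * 3
    + [("Scott Wyden", "page3")] * 3
    + [("here", "page4")] * 4
    + [("photography tips", "page5")] * 4
)
```

Under these made-up numbers, only the exact-match anchor (30% of the profile) crosses the threshold, while the brand and generic anchors stay below it. That's roughly the shape of the "pattern" argument: no single link is the problem; the skew is.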
Hey Scott, just a quick tip. If the photographers that linked to you in footers/sidebars have quality sites, just ask them to move the link to their "about page" or something similar, "recommended friends" etc.
Keep as many quality dofollow links as you can, because Goog is fairly stringent about making you cut your own leg off if they think your toenail has got some fungus.
I actually did make that suggestion, but so many are not technical and just used the simple Blogroll feature in WordPress to do it. I offered to help whoever needed it.
Hi Scott, just a note on point 3. Neil Patel wrote a really interesting article the other day on some of the lessons he learned doing content marketing for KISSmetrics. One of the issues he had was content scrapers; his solution was to block traffic coming from certain IP addresses, which seems to have stopped the problem.
I've tried blocking in the past, but I found that blocking can also lead to further problems which are unexpected. So I don't block, however my host does monitor traffic for security so I'd imagine that a bunch are already blocked.
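For anyone weighing this approach, IP blocking is commonly done with a deny rule in the web server config. A minimal sketch for Apache 2.4's `.htaccess`, using documentation-reserved example addresses (the real IPs would come from your access logs):

```apache
# Block requests from known scraper IP addresses (example addresses shown).
<RequireAll>
    Require all granted
    Require not ip 203.0.113.10
    Require not ip 198.51.100.0/24
</RequireAll>
```

As Scott notes above, this can have unexpected side effects (shared IPs, proxies, scrapers rotating addresses), so it's a blunt instrument rather than a guaranteed fix.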
Rand, I was shocked -- but sadly, not surprised -- at seeing the "guest article" notice from Google about Moz.
I can speak from experience that Moz as a whole and the content and community teams specifically are very, very diligent. Personally, I've had some posts rejected, and the Moz team has taken their gracious time to work with me on improving other posts. (I'm sure others can say the same!) That's not a bad thing -- as I used to say in my old journalism career, even Woodward and Bernstein needed a good editor. A good, objective pair of eyes is always beneficial. And that's exactly what Moz does. I'm always thankful.
Now, of course, I don't know the behind-the-scenes details of what is going on. But perhaps I'd suggest one or both of these two options?
1. Contact Google and ensure that its webspam team stops marking links from Moz as (bad) "guest articles." (I'm sure Google must have a "white list" or something similar.)
2. If the above idea does not work, perhaps make all links to author-affiliated websites "nofollow," or make links to all external websites "nofollow." (A quick glance at the source code seems to confirm that this is not currently being done.)
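For concreteness, the mechanics of that second option amount to a single attribute on the anchor tag (the URL here is a placeholder):

```html
<!-- A followed (default) link: counts as an editorial endorsement -->
<a href="https://example.com/my-site">my site</a>

<!-- The same link with nofollow: tells crawlers not to count it as an endorsement -->
<a href="https://example.com/my-site" rel="nofollow">my site</a>
```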
In regards to the second option, I'm sure that Moz would get at least a little pushback from some parts of the community. But my response would be: If whether the links in your posts are followed or nofollowed influences your decisions about which sites to contribute to, you're doing it wrong. In the end, it's not about the links -- it's about helping the community to which you are contributing and getting the brand exposure in general. Everything else -- traffic, links, whatever -- indirectly comes later as a result. I can say that any addition of nofollow would not discourage me from contributing in the slightest, because followed links are not the point. (Who's with me?)
I'll repeat your posting of these excellent posts on that very topic because I wholeheartedly love them: Building Relationships, Not Links, Guest Blogging Isn't Dead, But Blogging Just for Links Is, Time for Guest Blogging With a Purpose, and (disclosure) a Moz post of mine on how to incorporate traditional PR strategy into your "guest blog posts" (or, more accurately, "by-lined articles").
Actually, I think the second above option may help Moz. If such links were "nofollowed," then I'm sure Moz would get far fewer submissions that are -- I'll be polite -- "of less than stellar quality." You guys and gals would be less swamped, I'm sure. :)
Edit: I forgot to add one point:
However, I can understand Google's point of view. It comes down to this statement: "Why should Google give you credit for a link that you gave yourself?"
No matter how diligent the publisher and how great the content, the fact remains that a link that, say, I put in a Moz post to my website -- in the content to reference source material, in the biographical blurb to give my background, or whatever -- is still a link that I gave myself. So, I can see the other side of the coin. For this reason, I'm leaning towards the "nofollow" option that I described above.
But obviously, it's Moz's call -- just wanted to throw out some thoughts!
Just a note about whether to nofollow or not. Letting a site pass link equity is an editorial choice, therefore it seems absurd to me that Google is the one telling us what an editorial choice is.
Making everything nofollow is like saying that none of the web documents we link out to have any value, which is quite absurd.
And linking out using only brand names (knowing how Google is buggy in that sense), clean URLs or words like "this", "here", "click to see the article" is dumb: usability FTW!
About crediting your own content: isn't relevance to the context what matters? For instance, I may link out to another post of mine on my blog, or on State of Digital, because there I was talking about related topics. Just because it's mine, should I nofollow it? And at the same time I'd be allowed to link out to another web document with the same editorial value, simply because it's not mine?
Monkeys... we are scrutinized by monkeys
In most instances, trained monkeys do a better job than humans.
I don't understand how anyone can be shocked by this result. It's an exact-match anchor link to an unrelated website in a guest post. Easily marked as spam by anyone who has glanced at the Quality Guidelines.
Google is certainly no fair playing field, but that particular example is nothing to be shocked by.
Samuel - I think you're absolutely right. We could nofollow the links and we'd likely avoid the very rare occasions when Google makes this an issue. But, I believe that's fundamentally in violation of our core values (TAGFEE) and is, in fact, merely bending over backwards and going too far to accommodate Google.
Every link from YouMoz is editorially reviewed, just like every blog post. Keri's rubric (in her comment below) details the remarkably solid process our editors use on every submission. To me, that's fundamentally an editorial link - we chose to leave that link in because we believe it will be useful and helpful to our audience. Those are votes I want any search engine and any visitor to know that Moz endorsed. If those engines/visitors decide that we aren't diligent enough in making those links valuable, it's their right to ignore those links. But the transparent, honest, authentic thing to do here is to follow links that are editorially approved. That's how the web is supposed to be and it's the web I want to keep contributing to.
Make a stand:
User-agent: GoogleBot
Disallow: /
You first, let me know how it goes :)
isn't "bending over backwards [..] to accommodate Google" the cornerstone of SEO?!
I think that's an antiquated notion, Tom. Google want to do their bit to make the web a better place, and moreover they do a good job of it. Unfortunately, we cannot please everyone all of the time. That's life, and I suppose we could look at these 'scandals' as indirect negotiations between the greater (SEO) web community and Google's own emerging 'guidelines.' It's always going to be a moveable feast.
I agree with this a ton. Reading the comments initially scared me, but using anchor text that is clearly not the brand name is an issue. This is nothing new, and no one should be surprised. Realistically, he shouldn't have used that anchor text in the post and should have just put a plain link to the website or used the brand name. It was his risk, and he clearly wasn't rewarded.
On your suggestion that you contact Google and let them know that Moz only allows high-quality links: it's a great idea, but getting any kind of definite action from Google is really hard for big brands, and next to impossible for the average webmaster.
They are (perhaps correctly) not set up to handle thousands of individual requests for corrections. Unfortunately, the current policies and algorithms don't really work without these, as they are still not smart enough to truly determine a link's intent.
Mildly frustrating, as Youmoz is really good for surfacing new tips and ideas. I keep quite a close eye out for new material on that part of the site.
We get a lot of false positives coming through reconsideration messaging. All I can say is that once an analyst has it in his or her head that a link needs removing, you're not going to win until it's gone.
IMO - the links included in the article itself are pretty editorial in nature (with the obvious caveat that, come on, we're all search marketers here and we know there's going to be an SEO benefit to linking out. Don't tell me that isn't at least part of the incentive to put a post on Moz.). Maybe it's the footer links. My footer's getting an edit right now...
Small inclusion: I wrote about the alternatives to linking here https://builtvisible.com/hard-fast-rule-guest-blogging/ (Irony of including a link fully acknowledged) :D
Google has received a reconsideration request from the site owner for: https://builtvisible.com
We've reviewed the links to your site and we still believe that some of them are outside our quality guidelines.
Sample URLs:
Please correct or remove all inorganic links, not limited to the samples provided above. After doing so, please sacrifice 1 goat, 7 chickens, accept C'thulu into your life and change your name to Trudy.
If there are still links to your site that cannot be removed, you can use the disavow links tool. No honestly, it really works.
Your palz
Google.
LOL! Yeah, saw that and just thought, not this again!
"Inorganic"? This is absolutely shocking! Are Google really serious about policing their search index? Why would then they allow big brands to get away with "blatantly" unscrupulous SEO practices while browbeating innocent webmasters under the pretext of building a better Web?
Another example of Google's roadmap for SERP manipulation.
This is simply hilarious. If the YouMoz link was flagged as inappropriate or inorganic by spam bots, then Google seriously needs to look into their spam-link-checking mechanism; and if it was done manually by Google's search quality experts, then they must have been sleeping during the audit process. :D
Rand,
It's truly unfortunate that a high-quality website like Moz, and a great article from Scott, would get called out as being manipulative...
But let's look at it from an agnostic/theoretical view of the web. What if it's not 100% about the site the link is on? What if part of author rank (or an algo update targeting guest blogging) assigns a group of topics to the author, treating him/her like an entity?
Google might think, based on data points, that Scott is a photographer (looking at a bunch of his profiles, it seems like that is his primary topic).
Scott = Photographer
Moz = Marketing (or some group of topics not about photography)
One of the triggers could be:
Scott publishing an article about link building on a highly authoritative website that is off-topic from his primary topic (photography) might trigger the link on said website to be flagged as manipulative, even if the topic Scott wrote about was on-topic for the website.
So maybe it's not 100% about Moz.com (as a website) but about the author's overall participation in guest blogging (and what Google thinks the author should be a specialist in) that triggered moz.com to be included in the list of possible manipulation.
In my mind this brings up the question, "Do sites that accept guest posts have to start digging into the author's background and guest blogging activities?"
Might be something to test...
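Bill's speculative model could be sketched as a simple topic-vector comparison. In this toy Python example, everything is conjecture for illustration: the topic weights, the profiles, and the very idea of a "mismatch trigger" are not known Google signals:

```python
import math

def cosine(a, b):
    """Cosine similarity between two topic-weight dicts, e.g. {"photography": 0.9}."""
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in set(a) | set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical topic profiles inferred from an author's history and a site's content
scott = {"photography": 0.9, "marketing": 0.2}
moz = {"marketing": 0.8, "seo": 0.6}

# In this speculative model, a high mismatch score would be one "trigger" signal
# for scrutinizing the author's links on that site.
mismatch = 1 - cosine(scott, moz)
```

With these invented weights, the author and the site barely overlap, so the mismatch score comes out high. Whether anything like this actually factors into Google's evaluation is, as Bill says, something to test rather than assume.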
Very interesting thought.
I think this is a solid idea. Testing it may give us a bit more understanding on the topic.
I see what you're saying, Bill, but this would mean Google fails to understand that I, or any other author, can be passionate about more than one thing, and that I would theoretically be relegated to writing about the one thing I'm most passionate about. More importantly, if their author rank algorithm is this shallow, then it's a fundamental flaw in the author rank algorithm.
The fact that I may not always write about a "fill-in-the-blank" secondary topic doesn't mean I don't have something valid to say about it. So, to use your example, if I am a photographer and I'm writing about marketing (because to be a successful photographer, I have to understand marketing), that shouldn't mean the post I wrote about marketing automatically gets flagged as irrelevant or unrelated to the topics I'm allowed to write about.
Hey Brad -
Not saying it's that shallow (or simple), or that it's the only value metric, just saying that it would make some sense for Google to apply a topic value to users (much like websites earn value and authority through inbound links, citations, etc.).
For example, Rand probably has more topic value (in Google's eyes) when it comes to SEO or digital marketing (based on all that he has written, and what others are saying about his value) than he does about cooking, since there probably aren't a lot of cooking articles written by Rand that have been referenced on high-value websites (forgive me, Rand, if you are a top chef).
Nor am I saying that if Google gives an author a primary topic value, said author can't provide value in other areas or topics. But I think that a user (much like a website) needs to earn that topic value by demonstrating competency in said topic (in the eyes of Google).
As I mentioned, it's a little ridiculous that a site like Moz would be flagged in GWT, or that a link in an article that has value would be flagged, as I have a ton of respect for Moz and Scott. Just trying to think outside the box and go deeper than the "exact match anchor text" convos.
It's not quite that simple - the links are to Scott's site (which includes information about SEO and marketing and business books that have a lot of overlap with Moz's topics), and they're from pieces he authored where he references those pages in order to be valuable to readers. In my blog posts on Moz I link to video games, to Shakespearean festivals, to sites that offer pet boarding, to beard grooming boutiques, and more. Those citations are intentional, editorial, and should be followed links (because we editorially endorse them). The fact that they're not also about SEO is no reason to stop linking to them - if we begin to feel pressure from Google to never link to "off-topic" pages/sites, the web will become a less connected, less valuable, worse place. That's why it's so important for Google to get criticism rather than blind faith when they make mistakes like this one.
Understood Rand,
I agree, and would not recommend that we stop linking to other sites - on topic or off. Nor am I suggesting it's that simple :) - much like I don't believe that all links are either a 1 or 0, but rather that there is a varying scale for the amount of value a link will pass based on an algo (possibly based on the author's topic value and contextual article value).
Looking at topic relevancy from a different angle, what user value (on a scale of highly matching contextual intent - not matching contextual intent) do off topics links provide to accomplishing the task of learning about a topic? If a user wanted to learn about a topic, and came to an article with that specific intent, and there are links to off-topic websites within an article - could that provide a less favorable signal to Google (again not necessarily a 1 or 0) and provide little user value in terms of intent?
"In my blog posts on Moz I link to video games, to Shakespearean festivals, to sites that offer pet boarding, to beard grooming boutiques, and more."
But what if you owned or had a stake in those websites?
"Those citations are intentional, editorial, and should be followed links (because we editorially endorse them)."
I don't discount the validity of the links, but there needs to be a programmatic way of determining a difference between Rand doing it, and Spammer X doing it - a way to determine intent, as well as separate content for link building sake and content for value sake.
"...if we begin to feel pressure from Google to never link to "off-topic" pages/sites, the web will become a less connected, less valuable, worse place..."
Completely agree, these concerns need to be brought up to Google, and they [Google] should (whether they do or not) take them seriously, as all "machines" need to evolve and take in all types of data points.
This sensitivity to links is actually counter productive to the framework that Google is/was built on, and has created a risk of linking out that is hurting the flow of information.
Not trying to discount the fact that this is an issue, or justify Google's actions, just trying to look at it from a different angle that is outside of the traditional response.
We actually do have stakes sometimes, and we note them. I used to be on the board of directors at Minted, and hold shares in that company, yet I've linked to them here on Moz (and noted my relationship when I did). Those links aren't nofollowed because I also genuinely endorse the site (I wouldn't have joined their board if I didn't). I believe Google's responsibility is to decide which links/entities/sites to trust, and based on what Matt Cutts said to Jen, I think it's fairly clear that they've been happy to trust us in the past. The hundreds of millions of people who create links on the web cannot be asked to architect their links without thought of self-interest, but Google can ask their engineers to build algorithms that account for self-interest. One is unreasonable and easily gamed, the other is scalable and pragmatic.
Good example Rand. I think that the main point to drive home is "What is the intention behind the link." If I have a partner site and it makes sense to my readers for me to mention that site then it's fine for me to do so with a followed link. But, if I am doing this on a large scale, and if perhaps I am using keyword anchor text in an obvious attempt to manipulate my Google rankings then it becomes unnatural.
But Google is not super great at understanding intention. Neither are humans, honestly. This should be algorithmic, not a moral judgment call.
OMG, once a link from DMOZ got considered unnatural and a warning was sent in the same manner to someone. I remember that John Mueller from Google stated, "That particular DMOZ/ODP link-example sounds like a mistake on our side."
It means it was all due to a mistake.
Here is the post -
https://plus.google.com/111696587332859976219/post...
Also see it -
https://www.seroundtable.com/google-dmoz-unnatural-...
I hope it will be the same case.
What do you think?
But as Google is quick to note "That particular DMOZ/ODP link-example sounds like a mistake on our side." it seems "mistakes" in reconsideration requests are happening more and more.
Let's see whether Google reacts on this post or not, because without Google's words we all can't say anything for sure.
At the end of the day, mistakes happen. Some Google engineer has had to manually include those links. Maybe s/he made a mistake: grabbed the wrong one, doesn't know better, naïvely thinks all links are bad... it could even be his/her first day. This could all boil down to human error, especially given Matt's wishy-washy non-answer reply to Jen, which could be more a case of covering it up rather than facing the embarrassment of admitting that it was a mistake...
Steve, the thing is, if your team spends 20 hours on link removals, link analysis, disavows, etc., and then Google doesn't even bother to give the sample of 5 links more than a minute of analysis, what is fair?? Something is WRONG with the process. I agree mistakes happen and you give people a second chance, yet this is happening a fair amount, and revenue is being lost for businesses. I have seen VC funds VERY angry over manual actions on websites for work done in 2006, when it is now 2014!!!
I hear you, James. I had a client where we disavowed hundreds and hundreds of bad links and managed to remove a fair chunk of them, and yet the penalty wasn't lifted because we missed two. Not fair at all. It's the equivalent of saying "you missed a spot." It's petty... but hey, what else can we do? We're playing Google's game, and Google are the boss...
Agree Steve, it is Google's game of chess and SEO's are just the pawns moving around ;) I know the feeling =)
I absolutely agree with the human fallibility hypothesis. ( I do not have a conspiratorial view of the world.)
There are some interesting parallels here:
Facebook ads
What is acceptable one day is not acceptable on the second day, before becoming acceptable again on the third day. I compare notes with dozens of others working in my niche and we agree: it's comically absurd.
PR Web News Releases
You can go back and forth for hours. Then a new editor comes on duty and approves the latest version; I've always suspected the new editor can't be bothered to delve too deeply into the whole change history. He just sees a good-faith effort to comply with guidelines.
Google ads
Not as bad as Facebook ads, but they have their random and arbitrary moments.
Grand conclusion? None really...beyond recognizing there are fallible human beings on the other side. So just keep plugging away.
And don't look for truth, virtue or justice. It will make you crazy!
Rightly said; it must be a mistake on their end. As Matt Cutts is on leave, his replacement may not be aware of the high-authority sites available on the net.
I highly doubt Matt Cutts would be checking reconsideration requests, except maybe doing some analysis on VERY high-level ones. It's evident that Google has a team somewhere in the world that deals with reconsiderations; they probably check tens of thousands a day.
I can't understand whether Google is fighting against spam or against SEO.
What to save now!!! I am really afraid of having a blog that writes about SEO at any point...... :P :P :P
Absolutely ridiculous.
Here's an example I had - https://twitter.com/dan_shure/status/458656465104814080 and in my opinion it was a perfectly good link, and my client had earned it naturally.
OK... I know this is an easy, cheesy game, but if an anchor text here is considered spammy, what would all the links in this post be considered: https://matt.cutts.usesthis.com/?
Google is becoming really strict with guest posting, wow, unbelievable!
Although I mostly agree, looking at it from Google's perspective:
How are these links organic?
I've always seen you preach the Google guidelines' way of doing SEO. So, no need to look thunderstruck, right?
If I write a post on Moz and in that post I refer to another post that I wrote in my own blog or somewhere else because it deepens or it's strongly related to what I'm writing, what should I do? Put a nofollow simply because it is mine?
In the context of the post that link was reasonably fine in terms of relevance.
And related to "organic/editorial", the guest posts are always reviewed, hence substantially "placed" by the site owner if they pass the quality review.
I agree that a good guest blog can link to whatever is relevant and adds value (even your own posts on a different domain). But how does that make the profile link to https://www.photographers-seo.com/ organic? That link is just there to point to one of the author's projects. If I were to judge, in Google's view that link is unnatural.
I see your point, but where do you draw the line? By that logic, no one should ever link out to their own websites - e.g. if I write a guest post, I can't link to my own blog or my work website. Which is utterly crazy and nonsensical, and goes against the whole purpose of the Web (IMO).
Once you use keyword-rich anchors to link to your sites from guest posts and bylines, you're playing with fire!
I think the "bio" link is less offensive because it's not contextual. The deleted link to his photography site was fairly high up the page and completely off-topic.
Hey Sander, per a comment above, the penalty is not related to the Photographers SEO link or domain. It's about the link to his personal blog.
I don't think that matters that much. In the end a Google guy reviewed the link profile, found this blog post and noticed a couple of things:
I don't think Moz is to blame here, to be honest, I'm a bit surprised by this post. I think every link can be the one pushing the coin, whether or not the site or SEO is the one to blame. It's about the grand total.
What also disappoints me is the total surprise of some of the reactions. Over the past few years, Google's stance against any form of unnatural links has been getting more and more strict. Compare now against 2012: the full link profile tipping the coin versus exact anchors only back then. Google is moving forward (although not as fast as I wish, but that's another story), so should we as SEOs. Analyse competitors' tactics, view what happens over time via a tool like Searchmetrics, and try to connect the dots: Where is Google moving to? Am I doing the right thing (now and in the long run)? If not, am I willing to take this risk? Log what you've done, so in case of a penalty you can rule out the things that are less likely to have caused it.
The last thing you need in case of a penalty is to think like an outdated SEO. You'll only get more frustrated by every attempt to file a reconsideration request.
This sounds really strange. If such editorial links are getting flagged by Google, then we're on a wild goose chase. Google is trying to outplay search marketers.
Do you not think it's incredibly ironic that the subject of his post was "build relationships - not links"?
Oh, yes, you did. I've just read a little further.
I strongly dislike the random, clearly "bulk" emails I occasionally see from services that ask me to remove a link because "it was identified by Google as an unnatural link"... when it's one of the most natural editorial links a site could get. I have no idea if it's really specifically called out, or just lazy penalty removal.
Google needs to tighten this up. It's a nasty symptom.
Well, I see the YouMoz post written by Scott dates back to October 2012. I'm not sure exactly when Scott connected his Google+ authorship, but it's ironic that he has connected it to that post as well. It shows up correctly in Google's rich snippet testing tool, and Google still flagged it. Granted, connecting authorship doesn't automatically mean that your guest article is not spam, but I'm thinking back to the Matt Cutts video where he talks about an author who has "sweat over" their post, "has a message" and "some perspective" that they'd like to get across, and gets the post on a trusted site (like this one).
When dealing with sites that come to us with a manual link warning/penalty, we've now found that NO link/site is "safe". We've had clients who are listed on authority accreditation sites, for example, with non-anchor-text links, and those very links still show up as ones that need to be removed. In fact, we've found that in many cases when dealing with an actual manual link penalty we have to remove (or disavow) 100% of the found links, regardless of hosting site, style of link, or age. It's pretty ridiculous actually and makes absolutely no sense. But that is the only way we have been able to get the penalties removed. This infuriates certain clients who have historically received direct clicks from said links and wanted to maintain them for that reason, but we've found no workaround for it. Even NON-indexable links have to be removed.
Has anyone else seen the same thing?
The reason, I believe, is that the guest blogger in Google's eyes is a photographer, and here he is talking about link building, which is a sensitive topic for Google. Perhaps Google is not happy to hear people talking about SEO and link building. This is not quite reasonable or convincing, but you have nobody to argue with, so be more careful!
That's a valid thought, but if Google is already looking at everything I do, and knows I'm a photographer then they also know I spend many of my days helping photographers with their SEO at the Photographer's SEO Community.
Thank you for your reply, Scott. Perhaps something you have done on other sites has led to this manual action, and Google suspects your link on Moz together with others. You can later share your experience of dealing with this manual action. Followed you on Google Plus! Thanks
Since this is an informative site and some of us are paying pro members, why not just have all links here (including YouMoz) be nofollow links? That would solve everything. If this article had "we had this issue and here's what we did to fix it and you can too" context, that would be different. The solution on your site, since you allow comments and submissions, is to make all links nofollow.
Hey Lee - see my conversation with EGOL and some of the other comments I've left here. Basically, I don't think we should abuse nofollow in this way. Moz does indeed editorially review and editorially endorse content left by our contributors. Using nofollow on those links would actually be a violation of its intent purely out of overreactive fear to Google. As Jen's conversation with Matt Cutts (in the post above) indicates, even he feels there's a good editorial quality bar on Moz. I'd much prefer to keep the bar high, the spam out, and the links natural.
Fair enough. It appears Google isn't always following their own guidelines. It could be a tweak in the algorithm or some human error here and there. I don't believe in gaming the system either, but sometimes it is what it is.
Google has the search monopoly and for that reason we must ensure that they do not simply get their way, 'just because they can'. Both Google and SEOs are going to be pulling on the ropes for as long as the search service exists; sometimes in unison other times, like this one, not. I think it's the duty of SEOs to debate these issues and make sure that Google understands that we expect them to 'negotiate' with us in the interests of a better web.
We can't always expect them to get it right, but we certainly should expect them to be clear cut about what is or is not considered spam. As Danny Sullivan suggests, I too believe it is a matter of better communication... which Google have improved over time. We just need to keep pushing for it.
I think we are talking about the internet, and the internet is links - links from one page to another. Isn't that how it works? You write something about a page and link to it - followed, because that is the internet.
It's unnatural to link out only with nofollow.
I use nofollow only for blog comments (name links) and when I write about a page I want to dissociate myself from.
That's what Econsultancy did (a big digital marketing resource in the UK). But then you could argue... Would Google see that as an attempt at PageRank sculpting (i.e. trying to retain PageRank and SEO 'juice' through the site by stopping it flowing out to external links)? I'm sure that Matt Cutts said ages ago that Google ignores attempts at PageRank sculpting, but if Google were to think that Econsultancy were doing it on purpose in an attempt to benefit themselves, they could get in trouble for it (even though - ironically - they're doing it to stay in Google's good books)...
Damned if you do, damned if you don't.
I definitely know of more questionable links. I haven't read a post here which made me think "there was someone who just wanted a link" or "that's not helpful for this community".
Google's unnatural link penalties and the (necessarily!) painful and tedious reconsideration request process to turn around these penalties are primarily about one thing:
Intent.
As someone that's received the dreaded GWMT message, has done the hard yards, said sorry (quite literally), had the penalty lifted, and felt the reward of redemption, I urge anyone who's faced with an unnatural link penalty not to get bogged down in detail (in this case, the link examples provided by Google).
Whilst we should support Rand, Marie and others' wishes for things like clearer messaging and more personal attention from webspam team members, it's impractical and idealistic to focus on these issues when you're facing an unnatural link penalty and seeking a rapid recovery.
Recovering from penalties requires a metric tonne of hard work and serious introspection:
Technical introspection (thorough backlink analysis), and
Personal introspection (be honest with yourself -- really -- about your original intent)
Anything else -- like ranting about the robot that's brought your online business/community/website to where it is today -- is the path to a long and slow recovery.
So I'll say it again...
Intent. Intent. Intent.
Hey Rand,
I can't digest this action by Google, as Google ranks Moz for almost every SEO-related keyword because of their powerful presence and trusted name in the industry. What's the point of this violation then?
We all know how strictly Moz reviews content and what their quality bar is. I think Moz should dig into this case more and let the whole community know what made Google issue this warning.
Can I conclude that getting links from trusted websites is wrong? (I doubt it.)
Looking forward to detailed responses.
I can see both sides of the argument; the link text could be seen as spammy as part of a larger profile, but I fully agree that webmasters should be able to editorially add and approve any links they want.
Hi Moz Team,
Thank you for sharing all the information about the quality standards used by Moz to funnel down the submitted articles. I read Scott's article too, and can't find a probable reason for Google to mark it as inorganic.
I agree with Rand's idea about dofollow links. If providing a dofollow link were an inorganic thing, it should have been banned a long time ago. Link building is a healthy thing when the data flow is good, and it should be encouraged (as Moz does). Providing dofollow links encourages people to work more towards producing content based on research and analysis, and thus helps explore wider areas.
Do you think sometimes it could be as simple as having "SEO" or "links" in the link/anchor text/domain?
Maybe there is a list of red-flag words and phrases that can just trip you up sometimes. I know it sounds crude, but there must be some sort of automation going on in generating these GWT warnings, and like everything, it will take time for this to become more sophisticated. Google talks about "problematic sites", and I am sure a lot of these sites use these words and phrases a lot.
I'm sure this is way too basic though.
Martin,
I second your thought here. We should direct our attention towards this particular case and give our recommendations rather than mourn about it.
Hi Martin - Google looks for patterns. If there is a pattern of spam associated with certain url strings, you can bet the spam team has a way of tracking it.
Great topic; I really enjoyed the debates in the comments. I really hate the way Google polices the internet. I get it, I do; it just doesn't seem to sit well.
I think photography and SEO are different categories, and that's why Google considers this spam. But personally, I know YouMoz has a domain authority of 92 out of 100, so there should not be any other reason for a spam flag. Thank you.
As the owner of a small business with a website who sells online, I feel very frustrated at what is happening, and it seems I am not alone. It is like playing a game with ever-changing rules, and the one who changes the rules has no accountability, because he became very big and no one dares to challenge him.
As Rand said, if they don't want to count the link, that's OK; it's their policy not to count it. But WHY do they want to penalize the website? It is not true in any way that you can control inbound links; many practices online focus on negative SEO, where they spam a particular website, and then the webmaster has to disavow all those links to get out of the penalty.
Yes!! Moz doesn't violate Google's quality guidelines. I don't know why Google is doing this... I hope Google will apologize for such an action :) :)
Recently, I noticed one of the reconsideration request replies contained a link from a top SEO news site somewhat similar to Moz. Shockingly, there seems to be no end to how wrongly Google is determining the link profiles of different sites.
It's not the source of the link that is the problem...it's the overall pattern. If I have been building keyword anchor text links all across the web, and 99% of those are on spammy sites while 1% are on high-authority sites (that somehow let the link get in), even that 1% are unnatural links. Google is not saying that links from Moz are bad. They're saying that a pattern of manipulative linking...no matter what the link source is...is bad.
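That kind of "overall pattern" check can be sketched as a quick pass over a backlink export. Everything below is hypothetical (made-up domains, rows, and a manually assigned spam flag); real audits would work from a full export out of a backlink tool.

```python
# Hypothetical backlink export rows: (linking_domain, anchor_text, spammy_source)
backlinks = [
    ("spam-directory-1.example", "business books for photographers", True),
    ("spam-directory-2.example", "business books for photographers", True),
    ("spam-blog.example", "photography gear reviews", True),
    ("moz.com", "business books for photographers", False),
    ("news-site.example", "Scott Wyden", False),
]

TARGET_ANCHOR = "business books for photographers"

# How much of the exact-match anchor profile sits on spammy sources?
exact_match = [b for b in backlinks if b[1] == TARGET_ANCHOR]
spammy_share = sum(1 for b in exact_match if b[2]) / len(exact_match)

print(f"{len(exact_match)} exact-match links, {spammy_share:.0%} from spammy sources")
# -> 3 exact-match links, 67% from spammy sources
```

With numbers like these, even the one high-authority link inherits the suspicion of the surrounding pattern, which is the point the comment makes.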
The bottom line is that Google needs to be transparent.
Google is nothing but a recommender system that recommends items (websites) in response to customer queries (searches), in an ordered manner, as they publicly disclose ("order" = search engine rankings).
Some of those recommendations are made for financial incentives (ads) and are clearly disclosed, which is a good business practice.
Others (organic results) are not, and the complete absence of transparency there makes it a terrible business practice.
As a consumer of Google, going by business transparency requirements, I would suppose one is entitled to receive clear reasoning why some items are being recommended above others. That is essential for eliminating shady business practices. The question has absolutely nothing to do whether Google is being shady. In all likelihood it is not. At the same time, that does not excuse them from being officially transparent. That holds true for any business (and I would like to believe that most businesses in this world are also clean, just like Google).
The same ought to hold true for those being listed also.
For example, am I not "professionally harmed and bad-mouthed" when my product, the one that I have created with my very best (whatever the product is), is rated below someone else's (which is fine on its own) by a third party (Google), but without any explanation why (which is not fine at all)? To me, in the absence of transparency, all of this could be Google's whim! Doesn't that raise serious consumer rights questions? It ought to!
In other words, unless Google provides me with sufficient transparency into why they rate my business lower than others and publicly discloses that (to their users/searchers), they are doing something absolutely unethical. Aren't they?
Google talks about spammers gaming their system if they disclose. So, do they claim they are vulnerable?
As an example, if they claim that spammers are going to upset their content-match quality, would spammers not do that just by creating content better than the existing content? If so, would that so-called spam remain "spam" any more? We just said that Google found a page that has "better" content (and Google is meant to do text processing), and it is spam. Come on, then Wikipedia has terrific content and hence is spam!
As another example, if they claim that spammers are going to upset their algo by building "bad" backlinks, then what's a bad backlink? Something coming from Wikipedia? If Wikipedia links to it, the site is probably good. If site X is being linked from a "sufficiently large number of sufficiently bad" sites {Y}, then the site is bad. But then, the linking sites y in Y are known to be bad, right? Spammers cannot mask a bad site as good, can they? Obviously not. And if they can do enough to make the bad site good enough, then again we have a conflict - we are talking about a bad site becoming good.
Personally, I am delighted that the European anti-trust case against Google is happening. Google has been high-handed in bad-mouthing any business (web page) they want (by ranking them lower than others, except the #1 ranked site for the given keyword) every possible time, without giving a reason.
Rand, you may like it or not, but mostly, until being hurt by Google like this (as in this post), you had not realized, I suppose, that so many voices like ours on this earth are not all foul. How about closing your eyes for a moment and questioning Google's fairness? I am not talking about the algo's fairness. I am sure their algo is fair. I have trust in my computer scientist friends at Google. I have many computer scientist friends at Google. I have my computer scientist students at Google, I am sure (never bothered to check). There are people at Google who have been hired based on my recommendations of their computer science skills - they need 2-3 recommendations from technically well-respected people to absorb relatively younger computer scientists (yes, in the world of computer science and information technology, in the real world, I am respected enough - I use a pen name online, and I suppose I have been public about that many times in multiple places). And I know I have a huge point in what I am saying.
In my opinion, black hat SEO does not arise in spite of Google, it arises because of Google and their non-transparency.
And by the way, any link built by anyone on earth is black hat, if the owner of the linked website provides any incentive for building the link. That's absurd for algorithms to find. Google, for example, cannot monitor Visa and Amex for credit card money exchanges, or Verizon, AT&T, Skype, and the real world (face to face), to find and understand the conversation that happened before the link was built.
(Continuing the previous post)
So wake up. Let us agree to the following, and let's call a spade a spade directly to Google's face. Strangely, in other fields, the couple of times that I have tried to do that, it has invariably come back to haunt me. Let's avoid that. So as I was saying above, let's agree to the following.
(i) Any link one builds for ANY client is artificial. By Google's definition, any link built with an incentive from the stakeholders of the site is black hat. Hence, anything link-building-related about SEO is black hat. I don't believe anything "very black" or "gray" really exists by Google's definition as they put it, and anything that anyone does outside it is bending the announced playground. I have never bothered to check Moz enough till now to understand whether they offer link building services. But I do know they recommend link building services (ex: https://moz.com/blog/4-valuable-link-building-services). In other words, they do recommend going black hat.
(ii) Google is not transparent in what they do. That's unethical in my opinion. Because, they end up recommending products (our websites) to consumers (searchers) without showing a reason publicly (that's publicly defaming items for no given reason - "no reason" would be easy to circumvent by giving a reason - their algorithm features/signals - but they don't). Google may have a very well-defined and precise reason why they rank some things over others (they do, I am sure) and they may have all the good intent on earth (I personally think they do have a great intent, really, and I am being sincere in this statement), but at the same time, that does not excuse them from being non-transparent (and hence unethical) to their consumers as well as businesses they recommend / do not. That's true for anyone in the business of making recommendations in general, not just for Google.
To add to my other comments made yesterday, I want to add another question. I find it ironic and funny.
This article happened (was written) because it accidentally came to light that a Moz backlink was treated by Google as a link that violates their quality guidelines.
The list given by Google was a very short sample among many such links, which remained after a round of link cleansing by the site owner. (Note, this was the outcome of a reconsideration request, not Google's original penalization of this site, which would have involved a LOT more bad links.)
So, what makes us confident that this was the first/only time Google treated a Moz link as bad?
I would suspect this has been happening for quite some time! Rand, do you know otherwise?
Ironically, as the tiny set of sample URLs provided by Google would only rarely have included the Moz link, it was probably never caught. Clearly, this is not a case of something not being reported by a user in spite of knowing; it is something not reported by Google in spite of knowing, because Google cares to report only a tiny fraction of what it sees (salute the awesome transparency that Google acts with)!
Speaking from my understanding of the fundamentals of probability, the probability of this having happened is much higher than of it not having happened.
And the probability is significantly high that Moz has been thumping its chest for being clean in Google's eyes for a long time while it was not so. To work this out, think of conditional probabilities and Bayes' rule, and combine that with the ground reality of how many people end up trying to clean their sites of Google's mess, and how few URLs are given out of the large sample. You will easily realize that the probability of your site showing up in a reconsideration list is considerably lower than the probability of your site being marked as bad.
That, of course, raises the question - has Moz been considered bad for long, and it has remained unknown? Probability says, yes.
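The conditional-probability argument above can be sketched with toy numbers. All figures here are illustrative assumptions (not real Google data): a prior on a link being internally marked bad, a tiny reconsideration sample, and a large pool of bad links.

```python
# Purely illustrative assumptions -- not real Google figures.
p_marked_bad = 0.3       # prior: chance a given link is internally marked bad
n_sample_urls = 3        # example URLs Google shows in a reconsideration reply
n_bad_links = 500        # bad links a penalized profile might contain

# Chance a specific link surfaces in the tiny sample, given it was marked bad:
p_shown_given_bad = n_sample_urls / n_bad_links

# Joint probability that the link was marked bad AND you ever saw evidence:
p_bad_and_shown = p_marked_bad * p_shown_given_bad

print(p_shown_given_bad, p_bad_and_shown)
```

Under these assumptions, the evidence surfaces in well under 1% of cases even when the link was marked bad, which is the commenter's point: not seeing a report is weak evidence of being clean.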
Oh, the irony!
(By the way, I love Moz. I personally think it is one of the great sites around. The respect will increase if, now that they have finally seen evidence, they take a stance that clearly represents that evidence and are vocal about the fact that Google has not yet achieved in SEO what they dream to achieve; their SEO guidelines describe what they dream to achieve rather than what they have achieved. I hope you guys got what I meant by the probability of Google highlighting your stuff as bad being low.
In general I love Google too, often, but not always. For example, not for what they do with SEO, with being non-transparent and all that, not for what they do with AdWords, and not for what they do with AdSense. But generally, I have a lot of respect for Google. The technical talent they acquire is tremendous. They have some cool things such as Glass, Android, and so on. They are the only private company that dares to think as far as Mars. They treat their employees really well. There are many more reasons that I love them; these are only a few.)
Rand,
Thanks for the post regarding this issue, and for letting other Mozzers know what is going on.
I understand the frustration behind this completely. We have seen a lot of movement after the Panda 4.0 update across all our keywords. Truthfully, the top ranking sites for one of our most competitive keywords are awful. We did a link check for the domain, to try and get a better idea of what was helping them along, and were shocked at the results. Among the sites linking back were Korean recipe blogs (with thousands of comments, all with followed links), porn sites, guest blog posts on sites with 0 PageRank, hundreds of links per page with exact match anchor text, etc., and many many others. The number one ranking site's meta description has alt text showing up from images on the homepage in Google's indexed pages, lol. It's just awful, yet Google has decided to place this as the number one site for an SEO-related keyword.
The number two competitor has over 100 links pointing back to his site from Quicksprout, all exact match anchor text from posting in the comments section of the blog, usually within the same post many times over. After seeing this type of activity, it makes it difficult to understand just what Google has going on. I understand they are trying to combat spam, while putting more authoritative sites higher, but how can sites with linking patterns that directly violate the guidelines that Google has published (thin content, exact match links on questionable domains, etc) continue to rank so high? We have links from Moz, SEL, many highly rated PR sites, more citation links and profiles, a ton of content related to guiding people through SEO, small business tips, but yet these sites continue to out-position us for very competitive keywords.
In relation to the article, thank you for posting this. It does shed some light on what to look for in the "modern" Google for troublesome issues. It's also nice to see Moz respond directly to something like this and make it public. Most sites wouldn't even have the kahoonas to post that something like this happened with a link from their site. Great post, sir.
I can understand the logic here, but it really makes you wonder what SEO will be like in 10 years.
If Google were a government, it would be pre-Cold War Russia or the early years of the Chinese Communist Party. They are taking the position more and more that everyone is guilty, which is sad, because a massive portion of webmasters and small business owners out there do not have the technical knowledge, time, and/or resources to rectify these penalties.
If Google really valued their audience, they would embrace the SEO community and empower us with the tools to help people, not hinder us with inaccurate documentation, doublespeak, and a mountain of misinformation at every turn. Aaron Wall and countless other SEOs have been saying it for years - this command and control strategy only benefits Google's bottom line.
The harder it is to rank, the more people are forced into AdWords and the more money Google makes. It's been proven again and again - which is fine. They are in the business of making money, but don't present this culture of transparency and "don't be evil" if your practices are anything but.
I just decided that stating something contrary to what members want to hear isn't a good idea.
They essentially want people to turn to AdWords for increased traffic and revenue. You don't get to be a $40 billion a year company by playing fair and telling people all your secrets. For the most part, Google does give out a lot of information to guide people in the right direction, but it is frustrating when sites and webmasters that follow it to the best of their ability get hammered for misinterpreting it or not implementing it correctly.
I ran into the same problem several months ago. I don't know what is going on with Google's policies, but I'm hoping they will fix all those mistakes, since they are the world's leading search engine.
In my opinion, it is not a problem with Google. It might be a human mistake, because quality search raters have set guidelines for differentiating spammy websites based on different sets of rules. Most of the time it is your due diligence to carefully mark each website.
Now what I see here (I haven't checked the links for the site in question): people at Google ran a check on links, and they might have found a couple of similar links with bad use of anchor text. When the user submitted the reconsideration request, they took all the "guest posts" and marked them as unnatural link sources meant to manipulate rankings or PageRank.
If the above scenario is not true, then I think the person who reviewed the penalty reconsideration request at Google may have wondered what a "photography" related bio and link has to do with the website (Moz).
If both the above cases are not true, then there is definitely an issue with the overall link profile of the website, because if Moz were the problem here, all of the people who have links from Moz would be getting a notice (or they must have a balanced link profile).
Great post, Rand. I enjoyed the post not just for its helpful content but for the fruitful conversation between you and Matt. The controversy created here for the post Building Relationships, Not Links also got it many views from here.
Thanks Matt.
The conversation was between Jen Lopez and Matt Cutts.
Thanks for the post. The only fact here is that Google is not allowing anyone to get backlinks on their money keywords, in spite of the fact that the anchors used were descriptive for users. It's Google's game, and one must abide by the rules to play the game safely.
Thanks
If this is a mistake by the Google manual actions team affecting a domain with the authority of moz.com, then..
What can the rest of us expect?
Who says this is a mistake?
Why wouldn't Google go wrong with other domains and manual actions?
Hi Cristian: This has NOTHING to do with Moz as a whole - it is about a specific exact match anchor text link placed on a page of user generated content. This is a common spamming practice, usually associated with web 2.0 sites.
There is a certain arrogance at work when one private enterprise (Google) begins telling another private enterprise (Moz) that its links are inappropriate. At that moment, Google becomes the self-appointed head of a police state.
As soon as we start using their (Google's) moral categories of "right, wrong and penalties" to describe the free and independent actions of web users, we know we have entered into an entirely different ball game.
David, but there is the rule that if you want to rank highly in a private company's search results, then you have to play by that company's rules. It's their game; they get to make the rules. For better or worse!
But rules must have a meaning and all rules have reasonable exceptions.
All this reminds me of that Woody Allen movie where a South American dictator decided that from that day on, the official language would be Swedish and all citizens should wear their pants over their trousers: if not, prison.
Rules? They're not rules, they're "guidelines"!
Guidelines and rules are synonyms in the Google dictionary :)
Right On!
I agree--it reminds me of a comment from Duane Forrester that none of his sites have ever received a penalty and the reason being he always focused on quality content that spoke to his target audience's needs.
Now this issue is probably more complex than that but at the end of the day, Google gets to set the rules for us to play in their sandbox.
I think the point is very clear: Google is just trying to make search better and better. Ultimately it will be beneficial for all users of Google. Initially, when I started reading the article, I thought it was completely ridiculous, but later I realized it's fair enough.
The article is very effectively written; such articles always give positive energy...
The only thing that comes to mind is that Google's algorithm didn't recognize the link as Scott's official brand name but simply as a rich anchor text on a BLOG. This isn't a manual review.
I don't know how long ago this message appeared on Scott's GWMT but what’s sad here is that the same algorithm didn't recognize it as an editorial link.
I can say from my personal point of view that when we get editorial links, we take it for granted that Google would be more than fine with them, but this case proves that it's not as simple as we thought it was.
We know that they didn't do it on purpose and we all understand the problem that does exist with anchor text links in blogs but still -- as SEOs we would just expect for more from Google in 2014.
I'll be very happy to see what happens next after this post.
I've been dealing with the penalty for a few months now, but the Moz link only came from Google a couple weeks ago.
I really do not know what Google is playing at here; it's not as if most people and businesses don't have enough on their plate already without having to deal with little annoying things like this. The consequences for SEOs are more profound, so I hope Google does the right thing and comes out publicly with a response and perhaps some better guidance.
Looks like Google doesn't like any anchor text link built by the owner of the site, versus earned, regardless of the quality of the particular site and its review process.
Wow, the business books for photographers link is just descriptive anchor text leading back to an informative page, as far as I can see. It's a shame it's come to this. On the other hand, the example of The Photographers SEO Community is on shakier ground - on the page we've got the brand referred to as Photographers SEO Book, the photographers seo book and community, as well as Photographers seo community in the page title. Although there are 4 instances of The Photographers SEO Community in the code, there's not one mention of it in the copy on the page - I'd do something about that just in case. Maybe it's time to adopt a stricter policy about the links in blog posts and select which ones will be followed and which ones will be nofollowed.
If you read Matt Cutts' reply to Jen: 'Short of that, keyword rich anchortext is higher risk than navigational anchortext like a person or site's name, and so on.'
The specific example in Scott Wyden's Webmaster Tools link quality violation refers to the fact that the post in question uses the keyword-rich anchor text 'Photography SEO community', which is also Scott's URL. So I'm guessing Google's quality guideline is not objecting to adding the editorial link back to Scott's website; the actual issue is the anchor text used.
This suggests future link guidance: keyword-rich URLs should not be used as anchor text when linking back to your website.
Note: I'm not sure whether Scott will be getting another link violation email from Google, as Rand has used the same anchor text 'Photography SEO community' within this post.
That anchor text - as you say - is the very name of the community :-)... What should he have done? Used this kind of anchor text: 'Check my community, whose name is Photography SEO Community'?
This does raise an interesting debate about whether or not brands can be spammy by default. Google has targeted EMDs in recent years, and may be extending this to keyword dense domains...
The only thing is that the webmaster tools account was for my personal domain, not for the community. In fact, the community isn't even owned by me or under my webmaster tools account at all.
We've already seen the very thing. I performed an audit on one of the first sites that came to us with a link penalty and I discovered that although their actual links (locations/hosting sites) seemed legit because their brand name (website URL name) was actually a keyword in their space their entire link portfolio appeared to be over-optimized for anchor text. We were able to overcome but unfortunately any future linking would have to avoid their brand/URL name UNLESS it was used as www._______.com.
Lame.
These link penalties are really starting to get crazy. The thing is it is really hard to nail down what is good and what is bad after a certain point when many links are created automatically by scraper sites and algorithms. We recently figured out we had been blasted with a bunch of "live sex" anchor text links in September that sunk our rankings almost immediately.
LOL. Mighty Google gives and also takes something back... I totally see Google's point, but I also see one big monopoly doing whatever they want to maximize THEIR (not our) profit (for example by stealing content from our pages and putting it on top of the SERP).
So we can't steal somebody else's content because it's a violation, and they simply can...
The community has been through all of these arguments before when this happened in the past. Always proves to be great for branding, interaction, and link bait efforts though.
The Web = The Ocean
Google = The White Whale
SEOs = Captain Ahab
As for me, I'm just Ishmael hanging on to Queequeg's coffin.
Google mentions the YouMoz link not because the YouMoz domain is bad. They mention that link because Scott's website link looks suspicious with the anchor "Photographers SEO Community"; his website's link profile is heavily optimized for that anchor text and also for "seo for photographers". So the conclusion is simple: YouMoz is not bad, but his link is bad because it's against Google's guidelines.
Please read my comments above. Photographer's SEO Community is not a factor in this at all. The issue at hand was the "business books for photographers" link as that is the domain in my webmaster tools and that is the domain that was penalized.
Oops; my bad, Scott; I apologize for this.
Lots of interesting thoughts in the comments. I couldn't help but think as I read them though that it is things like this that we normally turn to Matt Cutts for - who is on sabbatical right now. There needs to be a "backup" Matt Cutts when he is out - someone who we all become as familiar with through twitter / conferences, etc. - so that we have someone (rather than a team of people that we don't really know as well) to call on for things like this. :)
Matt is just one cog in a VERY large machine. He pulls a lot of weight and he is the face of Google to the SEO community, but his short absence makes zero difference. What is happening now at Google was planned out months ago, as is the case with any large corporation. Google's John Mueller is well known to the SEO community and available for comment should he choose to do so. (Unlikely)
Sometimes I feel that Google has come to the point where they are losing ground. Search engines right now are filled with information, and there's little room for new sources to appear. Google can't change the whole design to present larger portions of data without hurting the overall UX. And why should they? There's a simple way to stay above the fold - money.
Google is a private advertising company and they can do whatever they want with their product, but... they are making money on someone else's work, and that's the major issue. Unfortunately, we webmasters let them set the standards.
I'm hoping Google was fair and penalized themselves as well! I noticed a link to a Google search results page in the article.
A case of a "keyword rich anchor text link": Scott's website was linked back from the "New Jersey Photographer" keyword. Is this not a violation of Google's Webmaster Guidelines?
Mr. Wolverine, I think it is too early to react in such a harsh and hot manner!!!
Google makes the same mistake all the time. Fortunately it is reversible... I've always found it a little abusive how Google sorts our links. In short, they are the master... All's well that ends well.
I'm really not sure what you're trying to say here. Can you please clarify?
Rand, you're not even talking about the correct link. It was the "New Jersey Photographer" link. Which has nothing to do with the article and absolutely does violate Google's guidelines by using keyword-rich anchor text, intentionally placed there by the site owner in order to manipulate the algorithm....in the author bio of a guest post. This is spam.
There are lots of red flags here that would catch the eye of a Googler.
First is the url:
https://moz.com/ugc/category/link-building
When Google sees both "UGC" (user generated content) and "links" in the url, it is begging for a review. A high percentage of urls including these words are spam.
Next is bragging about your performance on Google: "my article is showing up second in the Google Alert preview". Whether it's right or wrong, fair or unfair, I've seen a number of instances where Google spanks a website after it promotes how it is positioned.
The icing on the cake is the exact-match anchor text link "best business books for photographers."
Having performed over 200 link audits, I would have categorized this as a manipulative link due to the exact-match anchor text and asked for removal. In my view, this link does not conform to the Webmaster Guidelines and serves as a legitimate example.
I love the work that Moz does and I think it's one of the best resources on the web, but this one "slipped through" the editorial review.
I want to push back against the "exact match text link" part and think about the user. People don't always (or almost never?) read articles long-form. They scan through. If someone is scanning through an article, I want my link and the link's intent to jump out. This link passes that test: I know what I'm getting into and why. Using the anchor text "article" doesn't pass that test.
Colby - you may want to familiarize yourself with what Google classifies as a link scheme:
https://support.google.com/webmasters/answer/66356... You can push back all you want, but that doesn't change the fact that using exact-match anchor text is one of the most common manipulative practices.
I think a good example of what not to do, according to Google, could be this: https://matt.cutts.usesthis.com/
Don't you agree Chuck?
I wanted to write a post about that, but haven't had time. It's OK to guest blog and link to yourself if the links are ones that users would want to click on. BUT, if the majority of links pointing to your site are from self-made guest posts, then there is a problem. The links in that guest post were not created so that Matt could boost his search engine rankings.
Hi Marie - I have to disagree with you on that one. As Danny Sullivan points out above, if the link wasn't meant to be manipulative, he would have linked the word "article" instead of "best business books for photographers." More importantly, Google thinks the exact-match anchor text link was dropped so Matt could boost his search engine rankings.
Hi Gianluca - I would agree that your example is a textbook case of what Google would consider link spam - nice one :)
I don't think it's a red flag...
In my opinion, the exact-match anchor link Scott put in the post could easily be seen as guest post spam. It's not fair, but here's why:
https://internetfolks.com/dear-moz-anything-can-violate-googles-quality-guidelines/
Just left you a comment, Tom.
Thanks, Scott. I replied on the post.
Any person in web/digital marketing would love to contribute and get a link from YouMoz or Moz. What should webmasters do when Google sends these sorts of notifications or warnings? This is really difficult to understand or digest!
I just decided that stating something contrary to what members want to hear isn't a good idea.
According to Google's Webmaster Tools, I have over 1,100 links to my blog. I'd say 90+ percent are from sites I don't know and never requested. I took that as a compliment, but is it a problem? Am I responsible for policing them?
Why all the thumbs down? That is what Google expects - that every site owner know how to police every link that points at their site.