First off, let me just say that there are a lot of people smarter and more experienced in scalably attacking web spam than I am working in the Search Quality division at Google and specifically on the Spam team. However, as a search enthusiast, a Google fan and an SEO, it seems to me that, all due respect, they're getting played - hard.
Word is, the Spam team's key personnel had some time off working on other projects and supposedly they're coming back together to renew the fight. I hope that's the case because the uproar about black/gray hat SEO gaming the results is worse than ever, and deservedly so. It's getting bad enough to where I actually worry that early adopters might stop using Google for commercial queries and start looking for alternatives because of how manipulative the top results feel. That behavior often trickles down over time.
Thus, I'm going to expound a bit on a tactic I discussed in my interview with Aaron for fighting what I see as a large part of the manipulation of results in Google - the abuse of anchor text rich links.
The basic problem is that if you want to rank well in Google for a high value, commercial search query like discount printer cartridges or home security camera systems, getting links whose anchor text contains those words, preferably as exact matches, is invaluable to rankings. Unfortunately, natural, editorially given links are extremely unlikely to use anchor text like that. They're more likely to use the business or website name, possibly a single relevant word or two, but finding dozens or hundreds of domains that will link with this kind of anchor text without push-marketing intervention from an SEO is next to impossible.
That means sites that earn the natural, editorial links fall behind, while those who find ways to grab the anchor text match links and evade Google's spam detection systems nab those top spots. It's been going on for 10 years like this, and it's insane. It needs to stop. Just as Google's said they'll be taking a hard look at exact match domain names, they need to take a hard look at precise matches for commercial anchor text links.
Here's the methodology I like:
Step 1: Create a list of oft-spammed, commercially-directed anchor text. With Google's resources, this won't be hard at all. In fact, a good starting point might be some of the top AdSense keyword lists (this one was readily available).
(Image: just a sample of some of the 3,400+ phrases in one file I found)
I suspect Google's Webspam team would have no trouble compiling hundreds of thousands of phrases like this that have a high potential for gaming and are found in large quantities of anchor text links.
Step 2: Locate any page on the web containing 3+ links with any of these anchor text phrases linking to different sites. An obvious example might look something like this:
But any collection of followed, exact-match-anchor links pointing to pages on multiple domains could be flagged by the system.
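As a rough illustration of that filter, here's a minimal sketch of how the page-level check might work. Everything in it is hypothetical - the phrase list, the flag_page function and the threshold of 3 are assumptions for illustration, not anything Google has described:

```python
from urllib.parse import urlparse

# Hypothetical starter list from Step 1; Google's would be far larger.
COMMERCIAL_PHRASES = {
    "discount printer cartridges",
    "home security camera systems",
    "payday loans",
}

def flag_page(page_url, links, min_links=3):
    """Flag a page carrying min_links or more followed links whose anchor
    text exactly matches a commercial phrase and which point to distinct
    external domains.

    `links` is an iterable of (anchor_text, href, is_followed) tuples.
    """
    page_domain = urlparse(page_url).netloc.lower()
    suspect_domains = set()
    for anchor, href, is_followed in links:
        if not is_followed:
            continue  # nofollowed links already pass no value
        if anchor.strip().lower() not in COMMERCIAL_PHRASES:
            continue  # only exact matches against the phrase list
        target = urlparse(href).netloc.lower()
        if target and target != page_domain:
            suspect_domains.add(target)
    return len(suspect_domains) >= min_links

# Example: a page pushing exact-match anchors at three different sites.
links = [
    ("discount printer cartridges", "http://site-a.example/", True),
    ("payday loans", "http://site-b.example/", True),
    ("home security camera systems", "http://site-c.example/", True),
    ("my favourite blog", "http://site-d.example/", True),
]
print(flag_page("http://linkseller.example/page", links))  # True
```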
Step 3: Have manual spam raters spot check a significant sample of the pages flagged by this filtration process (maybe 5,000-10,000) and record the false positives (pages where Google would legitimately want to count those links).
Step 4: If the false positives follow some easily identifiable pattern, write code to exclude them and their ilk from the filtration system. If the pattern is tougher to detect, machine learning could be applied to the sample, running across the positives and false positives to identify features that give an accurate algorithmic method for filtration.
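To make Step 4 concrete, here's a hedged sketch of how the rater-labeled sample could train a simple classifier. The features and numbers are invented for illustration (assumptions, not Google's real signals), and scikit-learn is just a familiar stand-in for whatever tooling Google would actually use:

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features per flagged page:
# [commercial-anchor links, distinct target domains,
#  outbound/inbound link ratio, fraction of page text that is link text]
X = [
    [8, 8, 4.0, 0.61],   # classic link-seller footprint
    [3, 3, 0.4, 0.05],   # resource list on a legitimate hobby site
    [12, 11, 6.5, 0.72],
    [4, 2, 0.3, 0.08],
]
# Labels from the manual raters in Step 3:
# 1 = manipulative, 0 = false positive (links Google should count)
y = [1, 0, 1, 0]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# With thousands of rater-labeled pages instead of four toy rows,
# held-out accuracy would show whether the learned filter is reliable
# enough to run across the whole flagged set.
print(clf.predict([[9, 9, 5.0, 0.65]]))  # most likely [1]
```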
Step 5: Devalue the manipulative links by applying the equivalent of a rel="nofollow" on them behind the scenes.
Step 6: Create a notification in Webmaster Tools saying "we've identified potentially manipulative links on pages on your site and have removed the value these links pass." Add this notification to 60-75% of the sites engaged in this activity AND write a blog post saying "we've applied this to 65% of the sites we've found engaging in this activity." If webmasters send reconsideration requests arguing that the filter caught false positives, those can be sent back through Step 4 for evaluation and refinement.
Step 7: Create a flag in the PageRank toolbar for these same 60-75%, making the PR bar appear red on all the pages of the site. Announce this on the Webmaster Blog as well, noting that "65% of the sites we know about have been flagged with this."
(Image: that's gonna scare a lot of webmasters)
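Steps 6 and 7 hinge on notifying and flagging only a portion of the affected sites. Here's a tiny sketch of that partial selection - the function and the 65% figure are purely illustrative of the idea, not a real Google process:

```python
import random

def select_for_notification(flagged_sites, fraction=0.65, seed=42):
    """Pick roughly 65% of flagged sites to get the Webmaster Tools
    notice and the red PR-bar flag; the rest are devalued silently,
    which is what creates the uncertainty for everyone else."""
    rng = random.Random(seed)
    k = int(len(flagged_sites) * fraction)
    return set(rng.sample(flagged_sites, k))

flagged = ["site-a.example", "site-b.example", "site-c.example",
           "site-d.example", "site-e.example", "site-f.example"]
notified = select_for_notification(flagged)
silent = set(flagged) - notified  # devalued, but never told
```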
Step 8: Watch as search quality improves from the algorithmic filtration of manipulative link power and less spam is created as link buyers and spammers realize their efforts are going to waste.
Is this plan foolproof? No. Are there loopholes and messiness and ways clever spammers will work around it? Absolutely. But the folks I've talked to about this agree that for a huge quantity of the most "obvious" webspam via link manipulation, this could have a big, direct, fast and scalable impact. The addition of steps 6 and 7 would also send a much needed message that site owners and content creators would hear and feel loud and clear, while creating enough uncertainty about the value of the non "marked" sites to cause a behavioral shift.
Maybe Google's already thought of this and shot it down, maybe they've already implemented it and we just think all those anchor text rich links are helping, but maybe, this thing has legs, and if it does, I hope Google does something. I'm bombarded so often with the question of "isn't Google irrelevant now?" and "hasn't SEO ruined Google?" that I'm fairly certain action's needed. This type of manipulation seems to me the most obvious, most painful and most addressable.
Looking forward to your comments, suggestions and ideas - undoubtedly my concept is riddled with holes, but perhaps with your help, we can patch it up.
p.s. Yes, conceptually we could create a metric like this with Linkscape and show it in the mozBar and via Open Site Explorer and/or the Web App, but I'm not sure how accurate we could be, nor do I think it's the best way to help web marketers through software (given our dozens of priorities). However, the fact that our engineering team thinks it's relatively simple to build means it must be equally (if not more) simple for Google.
Rand, amazing insight! I believe the fundamental issue with using links to judge results is that the vast majority of web users consume content, while that 1% that creates content (and is web-savvy enough to build a website) controls rankings. Thus, not *that* many folks are competing for a *lot* of traffic-- creating these opportunities for link spam.
A more fundamental solution to consider is to give a voice to that 99% of the population who doesn't know what SEO even is, but can certainly be an arbiter of what content is helpful. Enter social media, where likes are the new links.
When you have a car, there is only so much you can do to increase the horsepower and make it faster. But modify as you might, you'll never beat a jet. Does this mean that Facebook is going to trump Google because of a more democratic and less gameable system? No, it just means that search is a utility that cuts across websites, but is less seen as a site itself. And that search is more useful for sorting through structured and unstructured data (like finding the lowest fare or seeking the definition of a word), as opposed to items that require more social context.
Who in their right mind is going to link to a site with "hard money lender seattle"? Now consider how much more likely it is that someone will write a testimonial about how that lender saved their construction project when traditional bank loans failed.
Putting on my cynical hat for a moment, I wonder if Google doesn't really care so much about bad organic results for these specific searches because they want users to look to the AdWords side of things instead.
Interesting idea, I was thinking the same thing :) But then again, I'm cynical by nature.
You might have a point as they are moving the organic results way down anyways....but it is like biting the hand that feeds them.
I would say that yes, you're right. But my question is this: If Google removes the instances of spam as you say, do all results get better? It is my inference that, in many instances, they don't, because they are in terrible neighborhoods where obtaining a link requires some extreme manipulation, no matter how you get around it.
Because of this, I believe those sites returned at the top of the results in these taboo verticals are a better indicator of quality than if Google eliminated manipulative links altogether - because editorially cited links would be more random chance than anything else.
Do I believe these sites returned are overall better than those returned in verticals where editorially cited links are the norm? No, definitely not. But that's the nature of the beast - when you have 20 people weighing in editorially on a website instead of 5000, variance will inevitably come into play.
But I do believe, given the circumstance and nature of these verticals, that a situation actually exists where ability to obtain links in a manipulative fashion, consistently, is a better indicator of quality than the opposite situation - one where these taboo verticals are only editorially crafted.
This kind of SERP, to me, would do Google - or the users - no favors.
If right now many (if not all) normal queries are affected, wouldn't it be an acceptable loss to give these 'terrible neighborhoods' a slightly more 'terrible' SERP than they have now, in exchange for far better SERPs for normal queries?
That's a really interesting perspective. Certainly, if true, it could be one reason why Google hasn't implemented something like this.
I guess my hope is that the web is now big enough and there's enough non-manipulative links out there to make for pretty darn good results without the help of this type of anchor text link manipulation on commercial queries.
Yes, because good resources are not necessarily needed to get a spammed site to the top. Besides that, spam sites are almost always a shell of a site, offering only a minimal amount of copy, and that is usually automated somehow.
From a user's perspective, it's frustrating.
Since Caffeine, or even earlier, the quality of the index has decreased.
Just when Blekko came out, I said, "Hey, this tiny company can almost filter out spam sites better than Google!"
Great post, it really is annoying how bad some results for highly competitive keywords are sometimes.
It sounds like you really thought about this and it sounds plausible - to be honest, I'm just not sure about #3, Google manually going through the pages. I guess in many cases, even for a human, it's not that simple to identify whether links are legit. Like a happy customer linking back to you using the perfect anchor text - this is certainly not a bad thing, right?
But as you say, although this approach could certainly be tricked in a clever way and is not foolproof, it would probably detect most of the obvious spammers and considerably increase effort/cost to build new, manipulative links.
When reading this, I also thought about something else. With all the talk about LDA, would it not be a viable step just to lower the importance of anchor text? After all, Google should be able to tell what a page is about, right? It's just a thought, but putting more focus on long term, authoritative quality links and less on the anchor text could maybe help solve this as well.
The problem with turning down the link popularity or link metrics knob and turning up the on-page, content relevance knob is that you'll get results that are filled with Wikipedia-like articles telling you about the search term/product/service in depth and detail, but likely not many good commercial results. When I search for "laptop cases" I want results that sell awesome laptop cases, not deep content about what a laptop case is and all the concepts/words related to it.
Still, I agree that some sort of greater content analysis could be used perhaps on the linking pages to determine if there's relevance between the page's content and what it's linking to (or the anchor text it's using). That could be another way to approach the same problem.
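To show what that kind of linking-page content analysis might look like, here's a toy sketch using TF-IDF cosine similarity between a linking page's text and the anchor it uses. It's an assumption-laden illustration of the idea only, not a description of anything Google actually does:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def anchor_relevance(linking_page_text, anchor_text):
    """Crude topical check: how similar is the linking page's content
    to the anchor text it uses? Low scores suggest the anchor was
    chosen for rankings, not for readers."""
    vec = TfidfVectorizer().fit([linking_page_text, anchor_text])
    docs = vec.transform([linking_page_text, anchor_text])
    return cosine_similarity(docs[0], docs[1])[0][0]

print(anchor_relevance(
    "Reviews of laptop cases, sleeves and bags for 13-inch notebooks",
    "laptop cases"))                   # relatively high
print(anchor_relevance(
    "Celebrity gossip and royal wedding photos from this weekend",
    "discount printer cartridges"))    # near zero
```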
You are - once again - absolutely right; a description of a product is probably not what one is looking for in most cases. And I guess not all searchers want to add "buy" or "purchase" to their query.
Maybe a combination of devaluing those links and putting more focus on the context of the linking pages and less on the anchor text would help. I'm confident those clever people at Google will figure something out!
I would think this to be very plausible: anchor text being devalued while the relevancy of the linking site's content to your site takes higher priority.
A site talking about laptop cases linking to a site selling laptop cases should be more relevant than a link with exact anchor text from a site not relevant to laptop cases.
Then a link with exact anchor text from a site relevant to laptop cases, linking to a site selling laptop cases, would be valued even more.
Rand: your statement about turning down the link value, and turning up the content applies particularly well to those sites that got hit by the MayDay filter. That was the precise behaviour that we noted on all affected sites.
Interestingly, it appears that Google has (although not entirely) 'relaxed' this update now. Specifically, at midday Mountain View time (that's a real timezone now, right?) on the 27th of October, we observed a large swing on thousands of SERPs that we monitor, with traffic coming back to affected pages instantly.
This leads me to my second point, where I respectfully disagree (again) on the subject of link/anchor text manipulation.
Your point of view is totally understandable from a search purist perspective; however, on inherently commercial terms, perhaps a lot of the time having the person with the "deepest pockets" (or just the best SEO people) at the top of the list is most likely a good thing.
If the whole world knew more about SEO, the organic listings would be thought of as much more akin to the AdWords results, where webmasters (and ultimately consumers) would appreciate that the best run, funded, creative and competent companies rank best. The problem now is not that a few people abuse the anchor text bias in Google's algo, but that not enough people abuse it ;)
Rand,
I do agree there is a bit of an issue, but all this might do is increase the value of those links and limit them to the people who can pay the money or provide the influence to secure them. A lot of smaller businesses that don't have vast resources will be pushed from the SERPs, leaving only big players who spend 50% of the client's budget just buying links - no more effort, just "let's open the wallet; how much do you want to pay?"
There needs to be a focus on verticals such as pharmaceuticals, as right now fake drugs and counterfeit products are hurting consumers. The issue around AdSense websites is secondary to actual websites that sell poor quality or dangerous goods, like the muppet highlighted in this week's NYTimes post. Businesses don't have to bid on the content network which powers AdSense, but they do so because the traffic is often targeted and relevant; they encourage and provide resources to these websites.
The issue being focused on too much is that an affiliate or lead generation website can outrank a real business, which comes down to the business not investing in SEO. If the money is right, any link is attainable - I guarantee you can even buy links from Google!
We also need to be cautious because "commercial" and "informative" posts are both valuable in different cases... and how does Google know which you are looking for in this instance?
Using the "laptop cases" example, I could be looking to buy a new laptop case, and therefore want commercial sites. On the other hand, an ArsTechnica article that road tested 10 different laptop cases could be what I need, if I don't already have a case in mind.
So my first search for "laptop cases" would be for information-gathering purposes. My second search would be for best price. (Admittedly, I would probably pick the winner from the article and Google for it directly, but you get the point.)
Maybe Google splits the results into two columns: Commercial and ... ummm ... "Other"? Or "Organic Links"? The ads could be interspersed with the Commercial results, to still provide the ad revenue.
I do not envy the Google Spam Squad's job. But I thank them for it every single day!
But they already have ways of guarding against that by looking at how many people go back to the SERP - if I search for laptop cases and click through to some content telling me about laptop cases I will go back and look again... Unless I really didn't know what a laptop case was and why it was important, then I would read it to my heart's content.
Google now have so many factors they measure that are 'indicators' that all they ever really have to do is turn those dials. I have to admit I think this is maybe more of an issue in the US as most major search terms in the UK return pretty much the results and suppliers that I would expect to see. There are perhaps a few specific industries that seem more spammy than others but generally I wouldn't say that was the case. If anything, the sites with the link profiles that most closely match the issues you raise in the post are the ones that are struggling the most versus their competition.
I'm not sure where I'm going with this really, but I guess if Google implemented everything you suggest then it might help reduce spam in the US and completely screw up results for the UK. Is it possible that Google are trialling anti-spam metrics in the UK before rolling it out in the US?
@FranktheTank,
I agree with you that happy customers linking to you with the perfect anchor text would be a good thing and, unless the webmaster running the site they're linking from is posting a lot of other links on those pages fitting the profile, that link shouldn't be devalued in Rand's model.
I do think that there are good cases for these links on pages, but that they are just far too abused. Maybe the exact match anchor texts should just all be considered "Sponsored Links" and valued about the same as nofollowed links, as you suggest, Rand.
Hopefully Google will do something about all of this soon.
Chris
@Chris:
Devaluing all exact match anchor texts could become a bit of a problem though; it kind of sucks when you own keyword.com and then all your regular links are devalued :)
First of all, I think this post from SEOmoz was somewhat awaited - at least I was looking for a post on this topic from you, Rand.
That the SERPs, especially in the case of highly competitive keywords and markets, are somewhat polluted by search spam is so evident that Matt Cutts and his team are probably already working on it, as the announcement of a review of the influence of exact-match domain names may indicate.
Your suggestions, which seem quite actionable, are good ones, even though I too agree with Frank that manual review of suspected link spam could be the toughest part to put into practice because of the operational difficulties it entails.
But the problem also resides on our side. When you refer to the phrase "SEOs ruined the SERPs," I'm sorry to say that the phrase is partly true.
I am not saying that the good SEOs have to start a holy war against the bad SEOs, but I do believe that the political correctness of staying silent is not the right way to fight search spam. There should be a way to improve webspam reporting, for instance, because IMHO it doesn't work: consider that it ultimately punishes the websites that break the rules, but not really the SEOs who broke them. How to achieve this? Honestly, I don't know how to answer that question, at least not now.
As for how to clean up the over-polluted link graph... maybe the problem is the monolithic nature of the algo. As you say, Rand, right now it doesn't really make sense to rely more heavily, or solely, on LDA/semantic/contextual/on-page factors, because that could also damage white hat commercial websites.
Maybe - please note that I'm just thinking out loud and sharing theories - something could be done algorithmically to distinguish informational searches from purchase-driven ones. That way, "wiki/news" kinds of sites could be split from "commercial" ones, and a variation of the algo created for the latter that gives even more weight to citations (reviews, comments...) and social buzz. I know... that would mean diminishing the power of the link graph somewhat, but I believe the importance of UGC/social buzz already is, and increasingly will be, a main factor in the rankings.
Finally, another possibility would be to give more weight to "TrustRank" and diminish the power of "PageRank".
But, as I wrote before, these are all brainstorming thoughts I like to share, even though I'm not sure how actionable they could be.
I like your idea to algorithmically split searches by type without requiring the user to type 'purchase' or 'buy' into their query. Ask has been working on their language processing algorithm for years and are getting pretty good with it. Something similar for Google would take some time but I think would also make them more competitive in the areas where Bing and Ask are the stronger engines currently.
Chris
Unfortunately, it would be better to say "Ask was..." as it shut down as a search engine just a few days ago, as you can check here... :(
I'm not an SEO professional, but found this article linked through my daily reading routine. I didn't make it very far through the article, but this stood out.
It's getting bad enough to where I actually worry that early adopters might stop using Google for commercial queries and start looking for alternatives because of how manipulative the top results feel. That behavior often trickles down over time.
I've been using DuckDuckGo as my primary search engine for about the past 4 months -- especially for very specific searches. I'll go to Google when I need something within a specific date range, but Gabriel @ DDG just rolled that feature out there, too.
Just did a search with "SEO".
I like how it presents alternatives, but I find it strange that SEOmoz (which appears twice) is listed so far down in the rankings.
Maybe the guys at DuckDuckGo could... well, not reveal the secrets of their algo, but describe in general terms how it works.
@Rand: in your longing for "clean-old-style" SERPs, maybe you will find it interesting.
Post Scriptum
As Malbiniak didn't include a link, I think it would be useful to actually have it for general information. Here it is.
Ah! The non-optimized anchor text was written on purpose, given the nature of the post and of many comments ;)
That DuckDuckGo is actually fairly cool - maybe suitable for keyword research. Check out the examples:
Duck Duck Go AdWords
Duck Duck Go SEO
Note: I used the product name as it's cool and all the links are nofollowed anyway... I guess if I had used "Search Engine" that might be classed as keyword friendly...
Nice to hear your opinion... and my non-exact-match anchor was a (maybe pitiful) way to add some irony to the post :)
So the question is: does the community turn away from helping clients to focus on doing Google's job for them?
I've been trying to turn people onto DuckDuckGo for well over a year now, glad to see it's catching on!!!
I first heard about DuckDuckGo from C.J. Jenkins blog. Imo, it's one of the best search engines out there that few people use/know about.
I'm a huge fan of Gabriel and I think his David v. Goliath story is both compelling and enjoyable, but my experiences with finding what I want from DDG haven't been great (yet). Google's commercial results are struggling against spam and manipulation, but for the long tail and branded searching (i.e. finding a specific resource on a specific site), they're still head and shoulders above everyone else.
Hi Rand,
I absolutely agree - working in B2B, it frustrates me seeing people get thousands of links, on exact keyword matches, for subjects that, whilst interesting - let's face it - would in a realistic world be better judged as deserving tens of links. This then just makes the SEO industry worse, as agencies claim, "well, I can get you a top 3 position by buying 2,000 links."
This then delivers a worse result for the user, and Google results become worse as they try to demote sites that have fallen victim to SEO agencies following this line.
I'm not sure what the answer is, as gaming these things is easier, but if search engines could pick up the social side of things better (things like LinkedIn recommendations, Facebook likes etc.) and compare this with links, then I'm sure there is a way to find a better result. But this then relies on other companies doing their bit to fight spam. After all, how much spam are we seeing on FB, Twitter and LinkedIn these days - and they don't have a public Matt Cutts.
Social media is just as easy to game, maybe easier... how about just counting the first x instances of exact match anchor text, maybe on a declining scale?
What do you do in extremely competitive verticals that have no naturally given links at all? I'm thinking poker, gambling, porn etc. Show nothing?
True - my friend worked with online gambling for a year and his monthly link budget was $25k, and there were five other SEOs with the same budget! I am not sure there is a solution for big brands, and I seriously doubt Google will ever even bother.
Something definitely needs to be done....but will it?
I mean I really hate to say it, but I'm considering going over to the dark side for a few things on some sites that I work on.
Currently there is no incentive for me to spend 80-100hrs a month on link building when I could spend 5-10 getting smart "black/grey" hat links that won't be detected and that will add a lot of value. In fact, I'm losing out by doing it the right way.
Anyone else with me on this?
Hmm, funny that I posted a similar article last night to YouMOZ, although yet unseen by moderators. Totally agree spam is ruining everything, and truly no one ever links to you with the correct anchor text unless your keyword is your domain. Usually CLICK HERE or HERE or just your URL are used.
But...plenty of sites perfectly legitimately offer "link to this page" code, and of course they've inserted a relevant anchor for a phrase they believe/want the page in question to rank for. There's no way Google can manually review the Internet. They can offer to let their users do it, but as we all know that'd just end up putting us in an even worse position as companies pay armies of people in developing countries to thumb down their competitors.
I don't think there is actually a workable answer, but here's a thought. How about Google gives us an option to never see a site in our results ever again? I know I'd use that. I know I wouldn't abuse it either - I need to see my competitors' sites. It might be algorithmically possible to spot "real" thumbs down vs manipulation attempts.
Andy - great example, this is precisely why I'm suggesting that you do it not for individual links on a page, which could legitimately be there and intend to point with good anchor text, but only when there are blocks of them pointing to multiple sites, which are, let's face it, almost always manipulative.
I ultimately agree with Rand, very good post btw.
But, the reality is that people will always game any system.
So let's say that we move to a more editorially based measurement. All that will happen is those people who are quick to identify the change, usually black hat because they are constantly pushing the search engines, will then move to manipulate the link profile through bogus editorial (maybe paid for etc).
For example, they start to write great content, which gains natural link value, but if everyone is writing great content, how then does Google determine the top 10, top 100, whatever?
Like any commercial enterprise it will come down to those willing to invest and those willing to take chances who will gain the most.
I personally will never go black (or grey) hat, but maybe, just maybe, I will never be as commercially successful as those who are willing to go the unethical route.
And thinking further about ethics and business, the reality is if you asked 100 MDs whether they could be a little bit unethical but double their turnover, I'm guessing 99% would say yes. But the moment I say to MDs that if we went black hat we could be blocked from the search engines, they always say no.
SEO is business, and business is making money, while I don't like the black hats, I can understand why they do it.
Totally agree with your comments - we can only work with the tools and rules set before us. Google will probably take this debate on board and hopefully improve their search results based on content and not so much on anchor text.
As SEOs, we see search engine manipulation more clearly than 99% of the population, perhaps even 99.9%. As it is, with up to 90% search market share in some places, why would Google change?
I'd prefer we concentrated more on getting an accurate and consistent keyword tool, than worrying about paid links.
Chris (on Scot's account)
I am curious, Rand: say I own a blog about printer cartridges and I publish posts on which printer cartridges have the best value (cost vs. sheets printed), and someone approaches me and wants to advertise on my website. I look at their website and see they sell various cartridges at good prices, are in good standing with the BBB, have been in business since 2006, and have a nice, well cared for website - I conclude that my visitors would benefit from doing business with this company. I agree to give them a banner spot at the top of my blog and also a dofollow blogroll link with the text Buy Printer Cartridges, and they pay me $xx.xx per month. Why should this link not count as a vote?
Don't get me wrong, I do see tons of useless spam and companies ranking by churning it out but I don't think it is fair for Google to use their monopoly to force people to spend their advertising dollars on adwords instead of buying it from the little guy.
JG
It shouldn't count as a vote simply because there's no way to verify that you're the type of person who editorially reviews the quality of your link buyers rather than just selling to the highest bidder, no matter how dirty.
I think the engines' preference (and a searcher's preference) would be that if you found a business you liked, you'd run an ad for them (maybe a banner ad or a text ad with a nofollow) but you might mention them when relevant in a post with normal anchor text - like their company name. Those types of links are far more likely to be editorially given, and they're what I'm suggesting the engines should be counting.
I will always disagree with you on the point that a third party that has a vested interest in me not taking advertising dollars away from them should dictate what is worthy of a vote and what is not, on my own website. It just defies logic in my mind.
It is not a matter of right and wrong because I hate black-hat spam as much as the next guy but I don't think tanking a few hundred thousand innocent sites in order to get to the few thousand that generate the spam is the answer.
In addition, if they did what you are proposing then a zillion more news and reviews sites would pop up selling editorial reviews. Spammers would build a hundred Wordpress review sites a day and populate them with fake reviews and testimonials and then we would just be back at square one?
I disagree with you here. I think the end result of what you suggest would be to end small business presence in the SERPs in favor of big brands that can afford the biggest marketing campaigns. I believe that the results in most SERPs point to relevant websites. I think that 90% of the sites that come up are using anchor text they are relevant to.
Here is a thought:
It could be that they have already tested this. And it could be that they found out the (satisfaction with the) results didn't improve. :-)
Since I focus on completely white-hat methods myself I would love to believe the opposite. Will the search experience get better if all paid links could be automatically devalued? I really hope so.
-sasa
Totally possible - I agree. One of my first thoughts when wondering about this methodology was "maybe they've already tried it and it failed to produce better results." Honestly, I hope that's the case given the relative simplicity of the implementation.
My other thought is that it failed because of the ratio of false positives, but then, why not simply increase the number of commercial anchor text matches - surely pages that have 5 or 6 or 10 commercial anchor text exact match links can be filtered even if 2 or 3 is catching too much good stuff.
I think the easiest solution would be eliminating ALL sitewide footer links as a first step. No site naturally links to other sites from every page unless it's a sister company, and Google should be able to tell the difference. Want to give partners credit (like, say, Mashable) with 7 sitewide footer links? Come up with another idea.
This is by far the most common Google manipulation trick, and why it's still working is a mystery: from spammy WordPress themes to crappy SEOs and web building companies using their clients' footers as a link garbage bin, Google treats them as pure gold.
Another suggestion: let webmasters and site owners opt into a plan where Google pops a small modal window or similar solution, asking the (verified Google) user who just clicked on your site to rate its relevancy, and do it randomly to minimize manipulation. Don't want to opt in? Lose Google trust rank points.
Roie,
You mention 'crappy web development companies' using the footer to add links.
I think it is perfectly acceptable to have a link back to the web design company that built the site in the footer of the client site. The client will have definitely seen this before going live anyway in most cases, and would have said something if they didn't want it there.
This is a legitimate use of footer links. I agree that spammy links in Wordpress and other 'off-the-shelf' website packages that can not be controlled by the user are negative and of little value.
At the end of the day though, if the website is actually worth looking at, it will have been built professionally, and probably will grow to have a good standing in SEO terms (e.g. link profile). Useless/spammy sites are often short lived and go offline within a year. If a site has been built professionally, it will likely not include those spammy links. It will have good content. And it will probably have a link to the web site development company that built it - and that will be there because the client thinks they are worth linking to (the company did a good job).
When I first started my web development company, I did not have time or resources to go out building links to my site in volumes high enough to compete with others in the SERPs. The links on the sites I was building were a lifeline and they were key to helping build my business to where it is today - and it was all based on legitimate hard work which my clients were happy with - hence why those links still exist today and some have been there for years. Even in cases where they have got other developers to help modify the site, I can see they have still left our link in, out of thanks/respect. The client probably doesn't know the value in terms of SEO, but they DO understand that it will be seen by their own customers - and that is the point.
So back to the original post topic - I can see where Rand is coming from, but the above should also be considered if Google were to adopt a strategy like this.
Matthew
Matthew,
I agree from a marketing perspective, so a search for "webdesign" shows iNowWeb.com as one of the top results, which has 127,694 links from 2,043 domains according to Linkscape. With 106,848 of those links saying "web design," what portion of that would be considered legitimate and what might be considered spam?
>99.9% would be my guess.
I'm sorry, but I disagree. Your clients pay you money for a job, and that's your reward, not free advertising for a limitless amount of time. IMHO this is not similar to car logos for many reasons: it affects search engines, not only people; it may drive traffic away from your business (which my company car logo will never do); and it's not mass produced.
In addition, I don't know your work, and maybe you build the best sites in the world. But honestly speaking, most sites I've seen built from scratch are outdated, difficult to administer and full of bugs clients have to live with, and that definitely means leeching a free link and logo is plain wrong.
But I think you've missed my original meaning: Google should treat external sitewide footer (or sidebar) links as what they are: often paid for, meant to manipulate, adding no value to a site, and the clearest signal of SERP scams.
I've always felt a designer or developer should either be paid or have a footer link. I don't see how they should get both. Any developers who've tried it with me have been posed the question of whether they want the link or to be paid. Guess which they go for ;)
It's not natural anyway. If you've got a website on a topic other than web design or development, then your visitors aren't looking for that, so it's contextually irrelevant. If the developer wants a link, it ought to be on the "About" page, where it can be contextually relevant and in a place someone who wanted to know who put the site together might logically look for it.
What if I am an auto mechanic and I want to exchange promotion with my buddy who runs a body shop? We have been doing business together and recommending each other for the past 14 years - why should that sitewide and/or footer link not be there and also count in the algorithm?
That "recommendation" link is not bad in the way you put it... but it's wasted in the footer. Instead, why not follow the example of SEOmoz with their partners Distilled and ExactTarget (you just have to scroll down to the footer and click and... surprise...)
Because you're not doing it as an editorial, natural recommendation. You're doing it to manipulate Google's results. We link to a page that talks about our partnership with Distilled, but I think you'd agree it's a pretty natural way to link - meant for humans and partially helpful to engines, too. But it's not like putting "SEO Consulting Seattle" and "SEO Consulting UK" in the footer of every page and trying to game Google's rankings. The problem is, recommendations like ours - natural ones - are losing out to ones like what you've posited, which, no offense, are clearly manipulative in intention and execution.
Right, Rand (about the editorial link)... but on the other hand, isn't the footer link also "devalued" by the algo? Or that's what I'm thinking. In that sense, I meant it was a "wasted link."
Many business people and even webmasters are not all that savvy, and placing a recommendation at the bottom of a page might not always be manipulative.
Great post! The best on this blog in a while.
I disagree with the last point though. Having data about websites that obviously sell links would be a very helpful metric. Maybe not for scoring top rankings in December 2010, but probably for top rankings in the future.
I'd really love to see that metric included in OSE - especially if it's not that hard to implement.
Just my 2 cents - from a perspective where I don't want to waste my team's time collecting links from sites that can be flagged as spam through algorithms.
Thanks!
Thomas
Great idea, and I do agree.
For too long have Google valued link campaigns and paid links over good content. (Paid links on an 'informal' basis, i.e. not using an easy-to-identify scheme like TLA, that is)
I know that Matt Cutts highlighted this specifically, although to be honest I doubt that Google will do anything about it.
I want something to be done about it, of course, but I think it's just sabre rattling.
I think this because Google were happy to take people from the anti-spam team and move them to other projects. And when spam started rising, Google didn't change anything. The problem just got worse. This clearly suggests that it isn't a major priority of Google.
To illustrate this point: if a big retail chain made some changes and sales started falling (the equivalent to spam rising in the SERPs), the retail chain would think "Hmm, our priority is making sales. Lets get sales back up". But Google didn't do this. Instead they just carried on as though an increase in spam is a good thing. Hence IMO it's clear that anti-spam isn't Google's main priority anymore.
Secondly (further supporting this point, IMO): Matt Cutts made a blog post called "What would you do if you were CEO of Google?" (https://www.mattcutts.com/blog/ceo-of-google/) in which he asked for "big ideas" such as self-driving cars and other non-search related ideas.
For ideas, Matt linked to: https://www.project10tothe100.com/ideas.html
And guess what? NONE of the projects have anything to do with search.
So again, whilst Matt Cutts spoke out about this (and I fully support tackling spam in the SERPs), I don't think that it's a priority for Google.
I hope I'm wrong though:
I'm good at writing content and making great websites which benefit users. I'll happily spend many hours writing a 1,000+ word article with pictures, videos, natural (i.e. dofollow, unrequested: helpful) backlinks, etc. But as it stands, Google don't want this. "Helpful content? Bah". They want quasi-spammy backlink campaigns over good content. And Google's actions don't suggest that this will change.
Again, I do hope I'm wrong though.
Google created this beast, and I hope they'll fix it. Self-driving cars sound cool, but I'm still hoping that Google's priorities change back to search.
If the past year or two are anything to go by though, I don't see it changing.
Anywhoo, rant aside: amazing blog post. The theory you suggest seems very practical. I'm not sure how Google approach this sort of thing, but what you've said does seem sensible. After all, any unspammy site wouldn't be linking externally to various other sites using such specific anchor text.
What you suggest seems a relatively good way of starting to weed out paid links and link farms. It shouldn't be too computationally expensive, either.
For the last year I have noticed Google completely ignoring web spam - maybe because Matt Cutts was busy climbing Mt. Kilimanjaro. Whatever the reason might be, it is causing serious concern for SEOs who are using white hat techniques to rank well.
Because of this, I have seen many good websites also indulging in black and grey hat SEO techniques to cope with the damage. It may be that Google is planning something really big to terminate web spam completely, or they are just figuring out what they should be doing. If Google is reading this, then they should definitely accommodate these changes in their algorithm, as I think they would be effective in dealing with the spam.
Hope the December updates bring some good changes that will fight the spam. Till then, keep reading SEOmoz :)
To be fair, the guy's worked long, hard hours for years and years. I think he deserves some time off and I don't intend for this post to be a specific critique of him. He was the first person at Google to identify the problem of webspam and pushed to have a division created to fight it when Larry + Sergey didn't believe it was an issue. He's done a ton for Google's relevancy over the years. I'm just worried that all those long, hard years will mean only temporary victories if many others don't step up to the plate and follow in his footsteps, though.
An equally easy way to deal with this is to devalue links from sites where you can find things in the source code such as "<!-- TLA Paid Links HERE -->". It is crazy when I stumble across patterns like Rand described, view source, and see comments that are blatantly saying "Hey, these are paid links".
It is almost more amazing that paid link sellers and others on the publishing side are becoming as lazy as they are.
Anchor text backlinks are probably the most abused elements used by SEO companies throughout the world; furthermore, the micro-sites that they are located on are a sign that this particular ranking factor should not be trusted in the vast majority of cases.
I came across one large ecommerce site last month, that was using at least 25 thematically related micro-sites which contained keyword phrase anchor text links pointing back to the main ecommerce platform. This site had great positioning for a lot of competitive phrases.
The vast majority of Internet users would not use targeted keyword anchor text to link to another site, surely Google would know this fact.
Until Google find alternative link metrics to measure the perceived importance and relevance of a site, the anchor text back-link abuse will continue.
Great post and some great follow up comments throughout. Maybe this should end up on Matt's last blog post asking what you'd do if you were the CEO of Google: https://www.mattcutts.com/blog/ceo-of-google/
Ha, and all I wanted was a jetpack.
I have a feeling the group that would turn down a visible sign for punished websites would not be the SEO department but the legal department. Can you imagine the lawsuits?
If that were the case, I suspect we'd already see the lawsuits for grey PR bars, PR 1 scores, etc. I can't think of a legal leg anyone would have to stand on.
Rand... I think Mert's comment could be read in the context of the current European Union investigation following the legal complaint by Ciao against Google for manipulation of vertical searches. Someone could try the lawsuit route if Google were to show a "banished" PR bar.
Rand,
Gray can be interpreted as "I am not sure" rather than "it is bad." Red means "do not visit." As a Euro-American, I can tell you that companies in every European country would sue Google, if for nothing else but slander. I do not claim to be a lawyer, but I know that every Google decision goes through three people (the engineer, the lawyer, the PR guy). I bet you the color gray was even decided by the lawyer.
I'm a bit late to the party with this one but...
I would echo the sentiment that Google don't really care so much about spam any more. They are a corporation and ultimately their focus is profit - this is what shareholders demand. If it doesn't have significant monetary value to them, they won't do it. Indeed, a constant, radical shifting of results and changing of algo surely wouldn't have a positive effect on shareholder confidence and thus share value...?
So whilst I can see them making continuous, small changes that improve the search results for users, I really can't see them looking to make big changes for search quality, and this may fall into an area of 'big' change. Maybe Google feel that maintaining a certain level of search quality is good enough for their commercial goals and that anything further introduces risk to share value?
Also have to say that IMHO any references to 'ethical' or 'unethical' practices are misjudged when talking about Google. From what I've seen over the years, they appear to operate with no such regard and are very much a corporate entity in the way they behave, unless it serves their PR purposes, so I'm always a bit at a loss as to why SEOs refer to practices in this way... just my 2c.
Let's face it - the idea that a "link is a vote" is surely the problem. The idea that you can create "White Hat" Backlinks is ultimately a fallacy.
Even if Google devalues links with optimized anchor text, then more value will be given to unoptimized links, and then people will buy those links and we are back at square one.
I totally agree that Google needs to take a close look at webspam driven by focused anchor text. It is getting to the point that it is hard not to justify doing it ourselves! After all, if all your competitors are doing it, then you have to think about the short term as well as the longer term. It is all well and good saying to your boss that in the long term all will be well, but they want to see results in the short term and are generally less interested in the specifics than we are.
I've seen a case where WordPress blogs are being put on expired domains (from things like trade shows) which have been snapped up by spammers. Some of these still have PageRank (sometimes good ranks) and are just there for manipulation. E.g. they read like:
Some content, etc.... Viagra link content, more words etc....
I'm upset and frustrated. Google needs to do a better job if it expects SEOs to stick to the rules!
Good show to the poster for proposing a solution rather than just whinging. Unfortunately, the way I see it, there are three big problems here from Google's point of view.
Firstly, the web is far too vast for any significant manual review. I would be surprised if they are able to even keep their eye on the top 5% of search terms. The sheer amount of data they handle is just too vast. This is why, back in the day, search engines with all their issues took the market share and manual directories and the like didn't.
Secondly, with this in mind, remember that Google is not actually intelligent. The longest algorithm in the world is still an algorithm. A computer cannot tell the difference between 'Red chair' as an anchor keyword and 'Red Chair' as in 'The Red Chair Company'. (Just made that up; now I want to start up a company called 'The Red Chair Company'. Hmmm. Fate?) What happens to the thousands of companies that happened to name themselves after their products? Do you just say 'hard luck, we're going to assume you're spamming, chaps. Sorry' and then just ban them? Lawyers would have a field day. Do you discount anchor text completely? That worked out great for Google's predecessors.
Thirdly, if you're going to start putting flags in the Google toolbar, or indeed applying any clear penalty to a website itself beyond discounting suspect backlinks, you are giving a gift to black hats and spammy link building companies. Competitor won't get out of your way? Buy 100,000 blog comments for £100 to their site and watch them plummet. It's Google they're going to sue for slander, not you. Again, is that better than now? I don't think it is.
There are genuine advantages to White Hat SEO, beyond the SERPs. It's an easier sell to clients. It integrates better, and complements wider marketing and PR methods. White hat techniques can often generate traffic themselves, quite aside from the SERPs. It often drives better quality traffic. The long tail is, well, longer. It's less of a gamble - your business or client isn't always one Google review away from being kicked off the SERPs and sinking. Are those things worth being 5th instead of 3rd for a handful of keyphrases? I'd argue they are, and most of our client base seems to agree.
I think you might have misread the post a bit. Pointing $100K in anchor text links at a competitor will do nothing but cost you $100K and have no impact on the link graph. That's the beauty of this model.
I think the problem may be that Google is crawling too deeply and therefore picking up all sorts of garbage links that in reality should be ignored.
If you look at the SERPs in Blekko and then compare to Google, the difference is down to Blekko not crawling so deeply and therefore missing all the garbage profile links and so on. Google has just become a victim of its own spidering power.
You know what the bad part about this idea of defeating webspam by searching for exact match links is? You are telling people that if they have too many perfect links pointing to their pages, they will be penalized. You can bet your horse that people out there who want to tear down the competition will just have a bunch of Indian link builders, automated link software, article distribution sites - and the list goes on and on - build links with exact match keywords to trigger this penalty you talk about.
You have to think from a perspective: How can you reduce spam without giving an unethical person the power to penalize a competitor's site.
I think you didn't read the post, comments or my replies carefully. I specifically noted that these links should have their rank-boosting power removed, not the targets penalized.
Relevancy of a link should be weighted more on KW in the title tags than the anchor text...
And the difference between having enough money to hire a world class seo team to do a comprehensive effort to rank high and having enough money to buy large numbers of paid links to rank high is.... what exactly?
Link quality, sustainability, targeted placements through relationships.
Buying links should never be the only strategy.
Google is completely forgetting something... and that is that I can easily beat my competition by buying "exact anchor text" links pointing to him. I can do a bunch of article submissions and I can go to payperpost.com and buy $1 links from all sorts of websites - with less than a couple hundred bucks I could kill my competition, hence making thousands of dollars!
OK, I haven't done this, but it has been done to one of my clients... and we did contact Google's webspam team, telling them we had nothing to do with those links and that, if anything, they should not count them rather than penalize us, but they came up with some generic BS answer and the problem still remains.
I don't know how you do it, Rand - how does one read through 122 comments and give a competent response to not only the article/post, but also not be redundant in your message? You pose an interesting question that I'm sure the Google team has either heard or possibly looked into, if not applied (and maybe taken out of their algo). However, I think the question definitely depends on the perspective you take it from: the SEO's, Google's, the consumer's, etc. If it is so easy from the engineering team's perspective, why not build it? As Nike says, "just do it..."
The spammers are definitely winning. I can list websites in the double digits that are using black hat techniques that go directly against Google's so-called "guidelines," and I'm not even in a traditionally competitive market. Some of it is just obvious too, like link buying (which has gotten out of control). I've reported several to Google, which has resulted in nothing. This gives me the impression that Google just doesn't care or maybe can't keep up. Either way, it comes off as not caring to me.
I've come to the conclusion that the black hats will always win. If only they'd use their brilliance for good!!
Good post, and I especially like #7 - there's still, imo, way too many people obsessing on those little green pixels.
An approach like this makes a lot of sense but it does seem to punish those that broker links rather than those that buy them. Any reason for that? An alternate approach that would impact the buyers of links more than the brokers would be a measure of anchor text diversity on the link portfolio of a given site. Penalize those with little diversity. Isn't this done already to a certain degree? I'd be curious to know what you think would be more effective... going after the link brokers or the buyers themselves.
Well, my thinking is that no one actually gets hurt. Sell links or list spammy links and the worst that happens is you get a warning in Webmaster Tools and a Red PR Bar indicating the engine isn't counting those particular links anymore.
Buyers are, IMO, the ones who do get harmed because they're essentially seeing that the money they've spent is wasted and even in cases where it isn't obvious (the 25-35% that don't get marked but Google knows about), they're worried the money's being wasted. My hope would be that they go and start spending money on white hat marketing methods that actually add value and improve their businesses long term (good for consumers, too).
Now I see why you took the path you chose. Very smart to have a visible indicator that these tactics aren't working rather than penalizing everyone. I can see how this change would be easier to implement by Google rather than getting buried in reconsideration requests.
Great post Rand, certainly in some of the areas I work in this is a real problem, looking at our competitors OSE results there are load of pages with spun content and links to sites so different in their nature that they have no place being linked to from the same page never mind the same site.
I did read that Google were thinking of giving the title tag of the page the link sits on more weight than the actual link anchor text. I think this would help in the short term, as all those pages like the one I mentioned above that aren't relevant to the sites they're linking to wouldn't be able to pass as much juice - but that only lasts until the spammers re-arrange themselves to spam the page content instead.
Here we always chase natural, white-hat links, and it's annoying to see people just pay for links and match, if not beat, our rankings. There's always a debate that even with the investment of time to acquire links you're paying for them in one way or another, but in my opinion anyone can buy a load of links, while real quality sites make the effort to acquire more natural ones. We just need some serious skills applied to detecting them!
Why not just devalue the use of keywords in anchor text? You said yourself that "real" links will rarely contain the keywords/phrases that are being used...so why doesn't Google just stop taking the keywords into consideration at all?
I think it's because exact match anchor text is incredibly powerful for finding a lot of non-commercial stuff, too. For example, people link to businesses with the name of that business, to pages with the names of products or resources on those pages, to blogs with the name of the person who writes it, etc. Since that info isn't always on the page or isn't always prominent, devaluing anchor text across the board could really damage search relevancy.
I think that there is a meta-SEO-spam issue that is being neglected. It's not at all clear to me that Google's consistently doling out a penalty for transgressions would solve the problem of SEO spam. Indeed, I believe the issue could be exacerbated by predictably punishing. I've written about the rationale behind inconsistently punishing (with regard to purchasing links) here: https://www.ecommercegeek.com/?p=97.
Although I share in your frustration, I actually believe that intermittent punishment is the proper course to combat link spam; it just needs to be done more frequently and with more publicity. I especially like your Pagerank toolbar suggestion.
From a psychological viewpoint, punishment needs to be consistent in order to be an effective deterrent and decrease a behavior. Speeding tickets are an inconsistent punishment for speeding - do people continue to speed? Even if they've gotten a ticket? OTOH, if you got an expensive ticket every single time you did it...
In a scenario in which sabotage (ie negative SEO) is not feasible, I agree that consistent punishing would be a better deterrent than inconsistent punishing. However, because negative SEO is fairly easy to accomplish, consistent punishment can be deleterious to spam fighting. Hence, I think it’s in Google’s best interest to keep SEOs guessing about whether this kind of linkspam will result in a penalty.
Great post Rand!
You touched on the subject of editorial links - what are your thoughts on that? I personally feel they should carry a lot more weight, as Google always goes on about fresh/new/relevant content. If that's the case, then surely a website being discussed regularly in editorial coverage is fresher than a website that has been out of the limelight for years.
Should there be some temporary boost from an editorial link as it doesn't take long before it's lost in the archives?
It would make it much harder for the little guy to get found on the web and they would never be able to compete with the big brands, ever. Plus just because a website is not in the limelight doesn't mean it does not have valuable content worthy of ranking.
I think we'll have to agree to disagree. Editorial links aren't that hard to come by, so long as you have something remarkable that gives people an incentive to share. And, IMO, that's what I'd like to see ranking in the search results (and by and large, it's what already ranks in the results for most non-commercial queries, where there's little intent to spam).
I completely agree that the Google results are being gamed with somewhat gray hat link building tactics. While I almost always focus on creating interesting copy to organically bait people into linking to my websites, if you hire a link building company (boy, there are a ton of them out there), in my eyes you're paying someone to find websites that will let the link builder control the anchor text. It would be interesting to hear from a link building company that doesn't put keywords into the company name field.
So maybe Google should start devaluing directories as a whole. ThomasNet or GlobalSpec, for example, certainly provide useful information, but their results are paid, as are those of many other directories out there. Why not devalue them and push link builders toward adding real content value instead of submitting to as many directories as possible?
So Rand, has Google called you up to offer you a job yet?!
Seriously though, I think you're spot on. It amazes me how Google continues to undermine itself by not changing or removing elements that allow people to game the system. Even the basic fact that you're basically encouraged by Google to build one site per topic/theme (in SEO terms) seems crazy to me.
- Jenni
I definitely agree with Rand that the spammy links should be penalized/devalued and not necessarily the sites acquiring them. Google has been rewarding spammed-up pages in their SERPs for so long that many completely legitimate sites have thrown on the black hat to avoid drowning. If Google turned around now and blacklisted every site with paid links, they'd likely eliminate too much good with the bad and their results would be even less relevant to the searcher.
Also, those who think SERPs would be less manipulable if Google measured "editorial approval" by Facebook "likes" or other social media indicators rather than dofollow links are being naive. How hard would it be to create a number of phony profiles to "like" a certain page or topic?
I think that a serious look at the volume and origin of keyword anchor text from external links could actually be very effective in combatting web spam.
Oh, you mean you haven't started on the social graph yet? I'm not crash hot on the phony-profile idea, as Facebook will eventually find and destroy those profiles, leaving you with an unnatural social graph.
It's not too late to start sharing your data with Facebook in exchange for those wonderful likes/shares.
Just an example, but Google gives the site text-link-ads.com a PageRank of 6, and it also ranks number one for many terms. Isn't this sort of condoning paid links, or are they just afraid of anti-trust lawsuits if they penalize businesses like this?
The only absolutely foolproof way for Google to remove manipulated links from the interwebs is to find every link, find out who owns the sites they're on, then go to their house and ask them an in-depth set of questions about said links. Things like: how many prescriptions from this company do you own? How many times have you rented a car from these guys? If they can't answer, the link's value is removed. Simple as that. Sure, it might take 200 or 300 years, but it's a price well worth paying.
But seriously, I have noticed a steep decline in the SERPs. I was doing some research on Google the other day for some of our keywords, and in one instance the number one spot was a reported attack site. Seriously. The highly coveted number one spot in a pretty competitive niche, sitting directly below the title "This site might harm your computer." If Google won't even remove a known malicious site from its rankings (this wasn't even a notice through AVG - it was straight from big G itself), why would it go to the trouble of removing spammy sites? Maybe we should just take the net out back, put a bullet in its head and start over. We have apparently screwed this one up beyond repair.
I definitely see your point, but I'm having a hard time seeing what would take its place. Anchor text is a great way to tell what a page is about (when not manipulated), and I don't see on-page factors being a better way. If the search phrase has enough commercial intent, would the page need a "buy now" button and be attached to a payment processor? /Mikael
Anchor text is a great way to tell your users what page they will be going to; however, one should be able to infer from the content surrounding the link what the link will be about anyway, so Google should be able to process the language around the actual link and infer what the page is about. If the link isn't relevant to the content around it, it shouldn't be there to begin with.
Chris
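Just to make the "judge the link by its surroundings" idea above concrete, here's a rough sketch of scoring a link by how much its anchor text overlaps with the sentence around it. Everything here is an assumption for illustration - the tokenizer, the tiny stop-word list and the Jaccard measure are not anything Google has described.

```python
import re

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "on", "is", "this"}

def tokens(text):
    """Lowercase word tokens with a tiny stop-word list stripped out."""
    return {t for t in re.findall(r"[a-z0-9]+", text.lower()) if t not in STOPWORDS}

def context_relevance(anchor_text, surrounding_text):
    """Jaccard overlap between the anchor phrase and the text around the link.
    A score near zero suggests the anchor has little to do with its context."""
    a, c = tokens(anchor_text), tokens(surrounding_text)
    if not a or not c:
        return 0.0
    return len(a & c) / len(a | c)

# An exact-match commercial anchor dropped into an unrelated paragraph scores near zero
print(context_relevance(
    "discount printer cartridges",
    "My cousin's wedding was lovely; here are a few photos from the reception."))
```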
I agree, I don't see a real workable alternative. I think the made-for-AdSense sites pollute the web as much as spam (and are in fact a form of spam), but Google has no compelling reason to get rid of them because they make money off of it.
I think links should still have value and citations/web references probably as well. Some social signals could likely bolster relevancy as well (and I think Google's already doing some of that with Twitter). My suggestion isn't to shut down the value of links or even the value of anchor text, just to find places that are clearly linking to multiple sites with multiple instances of manipulative/spammy anchor text and remove the value those specific links pass.
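For what it's worth, the page-level filter described here could be prototyped with something as simple as the sketch below. The phrase list, the three-domain threshold and the example URLs are all made up for illustration; a real system would obviously need far more signals.

```python
from urllib.parse import urlparse

# Hypothetical seed list of oft-spammed commercial phrases
SPAM_PHRASES = {"discount printer cartridges", "home security camera systems", "payday loans"}

def links_to_devalue(page_links, min_domains=3):
    """Given (anchor_text, target_url) pairs found on a single page, return the
    links whose value would be removed: exact-match commercial anchors pointing
    at several different domains from the same page."""
    matches = [(a, u) for a, u in page_links if a.strip().lower() in SPAM_PHRASES]
    domains = {urlparse(u).netloc for _, u in matches}
    return matches if len(domains) >= min_domains else []

page_links = [
    ("discount printer cartridges", "http://inkshop.example.com/cartridges"),
    ("payday loans", "http://fastcash.example.net/apply"),
    ("home security camera systems", "http://camstore.example.org/"),
    ("our privacy policy", "http://blog.example.com/privacy"),  # left untouched
]
for anchor, url in links_to_devalue(page_links):
    print("devalue:", anchor, "->", url)
```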
I agree, I just don't think it will be easy to draw the line, and there has always been an uneven playing field for big brands that can get away with anything.
If, as a website owner, I decide to list my top 10 or top 50 favorite companies I do business with, and I want to use descriptive text so my visitors know what those companies do, I not only think I should not be penalized for it - those links should count positively in the algorithm.
Wow, this is a great post. I hope Google takes a minute to read through it - it could definitely help them clean up some of the results.
Rand, you make a good point, but wouldn't this create a whole new problem amongst online competitors? Google has no way of controlling who links to your website. You have to remember that smaller businesses can sometimes rank in their niche with just a few hundred links. If your hypothetical became reality, my competitor could just link-spam the hell out of my site, and within a few days the majority of my links would be from "bad neighborhoods". So I would be punished with a red PR bar and creepy messages from Google for something completely out of my control? How do you suggest they handle this situation?
oiewuf - I'm suggesting only that the links have their value removed, not that anyone gets penalized or banned. And as far as the warning in WM Tool or the Red PR bar goes, that only applies if you're linking out in a suspiciously manipulative fashion (and even then, it won't hurt your traffic). Thus, a website owner is in complete control of this situation and those who have bad links pointed at them won't suffer (nor benefit).
Yeah, after rereading the article I realized that I completely misunderstood what you meant when I posted this response. Thanks for the clarification.
I agree with this post conceptually as it's a huge problem and this seems to be a good way of addressing it. However, what would this do to part time bloggers who provide useful information on their sites? Why would they continue to develop their sites if they had no way to monetize them? A majority of blogs that provide useful information but aren't selling products need to have a means to survive. By no means am I insinuating that bloggers should sell spammy anchor text links on their sites as a means of survival, but there needs to be some source of funding to make the owners want to continue to run them. Take a look at the blogs that you read on a regular basis that aren't indirectly selling products, meaning they aren't an extension of another business. How are they being funded? How would they be funded?
By no means am I suggesting that purchasing links on a blog or something like that is sweet or should be done. But, if I ran a blog and someone approached me and said "I want to buy a text link on your site," my first response would be "how much?" Though the infrastructure evolves, the business model has already been established. In some cases it works (merit based links) and in others it doesn't.
This idea would work in theory but there are way too many exceptions that would do an injustice to those with value-added websites/businesses without other funding streams who would inevitably get penalized.
As for the claim that sponsored themes are bogus: why would designers spend their time creating professional themes if they weren't going to benefit beyond a little bit of minimal exposure? If you take away footer links entirely on these things, what incentive do they really have to provide great themes? Branding? Perhaps, but how are people going to get to their sites? By doing a search on Google for the designer?
I don't have massive amounts of data on this, but from my experience browsing the blogosphere, I'd say that the majority of blogs don't sell targeted anchor text links, but instead use ad networks, affiliate programs or direct advertising to monetize. This move might affect the small percentage who do rely on manipulative anchor text links, but I'm not convinced that letting whoever buys the most links from bloggers rank #1 is a particularly excellent way to judge quality or build a great Internet ecosystem.
I agree but I will say that I was dealing with an ad network just a few weeks ago for branding purposes and when they sent me a list for their ad specs there were around 60 out of 300 that had a link option. Granted, it was roughly only 20%, but the sites included were ones that would be considered by most as authorities in their niche. I like the idea about tapping into Adsense to determine terms as it's an excellent starting point. Good post.
I hear you, but - maybe because my background is in pure marketing - when I think of ads on blogs, I think of them mostly as a brand marketing play rather than a method of link building... no differently from banner ads in the NYT or any other online newspaper. The same should be said about sponsored posts: they should always use nofollow links, since the "promotion" is evident, and if the guest writer is good he will surely get his compensation in the form of traffic, visibility, citations (tweets, stumbles, likes and so on) and second/third generation links.
I mean, the blog owner would still monetize his blog based on the number of readers and many other factors, and sponsors would still get brand visibility and other advantages.
About sponsored themes: personally, I have no problem showing the designer's link on my sites, as it is a credit. What should be more common is the option to opt out.
Rand, they have enough advice but an insufficient amount of brains, so should we care?
Oh gosh....yes please!
Google lads...include this kind of spam-aware thinking right now, eh!
:-)
Jim
PS thanks Rand...been thinking on this as well, eh, and your post is spot on with the new Google Place Search algo....
So essentially, you are asking Google to penalize sites for writing better anchor text, which is encouraged on pages 16-17 of the Google SEO Starter Guide? This should not fall under the spam umbrella, which is probably why Google has yet to penalize for it. Not only would your idea be a disservice to the user, as these are the actual terms they use to find content in the first place, but Google would have to dedicate extensive resources to deciphering which are legitimate "natural editorial links" written by, say, an SEO-savvy copy editor and which are not. https://static.googleusercontent.com/external_content/untrusted_dlcp/www.google.com/en/us/webmasters/docs/search-engine-optimization-starter-guide.pdf
Nope - not at all. I'm worried you might not have read the post carefully.
I'm saying Google should remove the value from the blocks of manipulative anchor text links that are often propping up poor results, not remove all anchor text value or even all commercial anchor text value.
Rand, I can accept that you may be correct that I didn't read the post carefully, or maybe it's just that I can't see how you can definitively tell a manipulative anchor text link from a natural one in every single instance. I just think that too many innocent sites will get caught in the penalty net. That sounds way too much like the American justice system. Someone will have to create the Innocence Project for websites wrongfully convicted of anchor text link spam - there's an idea for a new business model. Then there's the question of who decides that the results are poor. Isn't that entirely subjective?
What a joke, way too much time on your hands. Do you even practice SEO? Or are you one of those theoretical people who have never gotten a site to rank in their life? Why don't you invest the time it took to write this ridiculous post in improving your awful and overpriced SEO tools, and leave the algorithm to Google and the people who actually know what they're talking about.
Nathan - it looks like you haven't used many of our tools in about a year (or the new web app); you should give them a try! We've got a lot of people who use and rely on them. Admittedly, they need work, but we've got a team of incredibly smart engineers and product folks putting in great effort on that front.
With regards to my practice of SEO - admittedly, I don't do as much as I did for the many years I was a consultant and hands-on practitioner, but I still help a number of non-profits and startups, as well as lots of folks through our Q+A system. Given that this post has generated a few hundred tweets, a listing on Hacker News, several links around the web, 100+ comments and lots of thumbs up, I'd say your assertion that the time spent writing it was wasted is proved false by that data.
One of the best posts on SEOmoz EVER!!! Good job!
And you're a pro member? Seriously?
Am I a PRO member? No, I was just passing through here by chance :)
It's just that lately I've also seen an increase in spammy results in Google's SERPs, and seeing a post on this forum that lays the issue out in such a detailed and clear way literally took the words out of my mouth.