During my first few years in the SEO field, half of the sites I'd visit - those owned by my SEO brethren in the forums or over email - were what today we'd probably call "over-optimized." They tended to have features like:
- Keyword after keyword stuffed into the title element of every page
- Overly lengthy, keyword-rich URL strings
- Pages filled with "SEO'd" content that was never intended to be a focus for visitors
- Backlink profiles that lacked a single high-quality, "editorial" link
At its best, our profession is about making amazing things that people are asking to see (via their search queries in the engines) and then marketing it in the most optimal ways. At its worst (excluding the crap-hat junk that doesn't even deserve to be called "SEO"), it looks like this:
There's a gigantic gap between this type of "SEO" and the industry's best practices, but the individual recommendations and changes are so subtle that it's not surprising many practitioners go a bit overboard. After all, the process of starting SEO often looks like:
- Week 1: Notice in your analytics that search sends awesome traffic and start optimizing some meta tags (since you heard that's what SEO is about) by putting more keywords in them
- Week 2: See that those changes have had no effect, so begin doing some light reading on the topic
- Week 3: After you've skimmed a few SEO resources (perhaps not necessarily the best ones), start "optimizing" pages by filling the tags you've heard are important with your keywords, changing your internal links to be keyword-rich, placing more keywords on your pages in every conceivable tag and location, maybe even optimizing for some wholly bunk metric like the average keyword density of the top 10 ranking pages (a quick sketch of that metric follows this list)
- Week 4: Possibly see a bump in some rankings and, tragically, fall victim to confirmation bias, convincing yourself that the strategy has worked and needs to be repeated
- Weeks 5-20: Struggle with and eventually give up on SEO, or skate by on the fringes with equally poor-quality linking practices that get many or most of your pages penalized but maintain rankings on a few
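(As an aside, here's that "bunk metric" from Week 3 made concrete, purely to illustrate what people end up chasing. This is a minimal Python sketch assuming page bodies as plain text; the phrase-matching approach is just one possible way to count occurrences, and nothing here is an actual ranking factor.)

```python
# The "average keyword density of the top 10 ranking pages" metric.
# Shown only to illustrate what gets chased; density is not a
# ranking factor.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if len(words) < len(kw):
        return 0.0
    # Sliding-window count of the (possibly multi-word) phrase.
    hits = sum(words[i:i + len(kw)] == kw
               for i in range(len(words) - len(kw) + 1))
    return hits * len(kw) / len(words)

def average_density(pages, keyword):
    """pages: body text of the current top 10 results."""
    return sum(keyword_density(p, keyword) for p in pages) / len(pages)

top10 = ["cheap hotels in bangalore cheap hotels deals",
         "book bangalore hotels online cheap hotels"]
print(f"{average_density(top10, 'cheap hotels'):.1%}")  # ~45.2%
```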
This pattern (or some similar variation) has played itself out in 9/10 stories I hear from folks who've jumped into the waters of SEO haphazardly - and honestly, it's hard to blame them. The engines provide just enough information to keep webmasters curious but unsatisfied. Many of the sites and pages that rank well do actually employ pretty spammy SEO tactics, making it hard for those trying to learn SEO by reverse-engineering their competition's success (temporary though it may be).
This doesn't just apply in the on-page world.
If you haven't yet read them, this thread from Reddit - My Job Was to Game Digg - and this one on Hacker News about it are excellent examples of the perception problem that social media pushes for SEO have caused. This comment, in particular, stood out to me:
As you can see, the web's social voters and contributors have a passing tolerance for the "right" kinds of optimization, but a zealous abhorrence for those that violate their sense of propriety. Even if Google doesn't worry about "off-topic" linkbait, linkbaiters themselves should have cause for concern.
The engines aren't going to take it.
More and more, though, the engines are fighting back against this through changes like the Vince update (and subsequent focusing on brands as a way to sort out the web's "cesspool"). We've also recently seen a dramatic increase in the aggressiveness with which Google will change your titles, descriptions and negatively alter the rankings/visibility of sites that step over this line.
In the long run, it's hard to imagine Google allowing poor results to flourish - especially those who garnered rankings through manipulation. Those sites and pages that follow every single optimization tactic, from internal links to massive keyword focus to "perfect" anchor text in their off-site link building are going to stand out like sore thumbs to the engines. Sites that build pages designed to attract links with little to no relation to the host site will struggle against the biases in the social media world.
And sites/pages that abuse these practices (both on and off-page) are going to have a terrifically hard time earning "natural" links. The organic sectors of the web tend not to link out to those types of sites/pages if they can help it.
It might sound ironic, but there's an art to under-"optimizing" in order to achieve true "optimization."
p.s. Some folks noted they were hoping for a link to some good "best practices for on-page optimization" - here you go!
Rand, in 2000, I wrote a long story called "In Pursuit Of The Perfect Page." It looked at various tools designed to help people try to get the "perfect" match for what those tools deemed the best pages for SEO. As I wrote then:
Occasionally, I get questions about what "numbers" or "rules" should be followed to construct the perfect page for each crawler-based search engine. In other words, how many times should a term appear on a page for it to rank in the top results? How often should a term be repeated in a meta tag to attain success? How often can a term be repeated before a spam penalty ensues?
No one has these numbers, honest. Those that say they do are merely making educated guesses at reverse engineering the crawler-based search engines. You will always see exceptions to their perfect page formulas. Additionally, the twin rise of a greater reliance on "off-the-page" ranking criteria and human-compiled listings makes focusing on perfect page construction much less an activity than in the past. Those who are looking forward in the world of search engines are not worrying about "keyword densities." Instead, they are building content, building links and doing other activities that will benefit them in the future.
I wasn't real fond of these tools. I felt like people obsessed over using them rather than focusing on the broader picture. My conclusion:
I well understand the quest for the perfect page, and should you embark on that journey, I hope some of the tools I've mentioned will help. Nor do I mean to sound hostile to this quest or those who are making tools to help you. But please keep in mind that even armed with the right formula, your supposedly perfect page may not rank well. And if you've been spending all your time on doorways only to find little or no success, strongly consider a change of tactics. That doesn't mean abandoning doorway pages entirely. It does mean, however, to ensure you are balancing your efforts.
Enlarge your site to have real content on the terms you want to be found for. If you sell shoes, have articles about how to select different types of shoes. If you offer package holidays, provide some tourist information about your destinations. Build this "real" content and optimize it for your target terms. Then go out and link build. Find sites that are non-competitive with yours but on related topics and offer to swap links. These two activities are akin to building a house, while concentrating on doorways is similar to renting. Renting is easy and offers a lot of advantages, but at the end of the day, you don't own anything. Concentrate on building your house, and you should see traffic from search engines and other publicity venues over the long term.
So today, I guess it's kind of sad to read that little has changed from your perspective. But there's also irony. You're one of the chief providers of tools that encourage the so-called "over optimization," aren't you? That's what SEOmoz is all about these days, I thought -- trying to "scientifically" reverse engineer the search algorithm.
The LDA debacle seemed to be a classic example of this. You put out a tool, said it had this remarkable correlation, and people started tweaking their pages based on it. Some even claimed success. Then about a week later, an error was announced that turned out to make LDA no more important than a page's Google PageRank value.
A lot of people seemed to have spent a lot of time messing with the latest SEOmoz tool to chase those rankings, which pretty much seems the exact opposite of what you're writing about today.
So what do you suggest people do with your own tools? Ignore them? Understand that they are part of the overall toolbox? Do you need to add warning labels to them? Or do they inevitably cause some people to head down the over-optimization path you warn about, because that's just what some people will do?
Unfortunately, following the SEOmoz tool (and doing things the "wrong" way according to Rand's article here)... still works. Until it doesn't, things won't change much. Hard to say something is wrong (from an SEO perspective) when it results in solid rankings. Sure, the content may be crap and not what anyone wants to read, but if you are being paid to make someone #1, you sorta kinda hafta do what you gotta do without being "too" spammy. Unfortunately, you know you have to toe that fine line as closely as you can without crossing it.
It's definitely something we think about and try to be smart with when we build recommendations into software. With LDA, I was pretty specific in the post that the intent was not keyword stuffing your pages or getting every page a perfect score, but that it might be helpful for answering the question of "why is that page outranking me?" The illustration of recommended use I showed in the post was, in fact, to use it as a data point for comparison.
LDA, and topic modeling more broadly, is to my mind useful for marketers not just as a way to help show search engines our content is relevant, but also as a great tool for reaching visitors. If someone searches for the Rolling Stones, chances are they would love to know about the members of the band, the songs and albums, the upcoming concerts, etc., and those are the types of recommendations LDA could hopefully someday give (and today, provides scoring on).
I think part of SEOmoz's obligation to our customers, and our defined mission, is to be able to show the value of any particular SEO tactic with good math and science behind it. Our progress so far is small, but more substantive than what I've seen publicly elsewhere on the web. We're committed to greater accuracy, better recommendations and bringing more science to the process of marketing. I think that falls in precisely the opposite direction of spam, manipulation, keyword stuffing, and "over-optimization."
I don't see where the advice above - not to go overboard on manipulating keywords or links - conflicts with our business of providing recommendations in software. And while I agree (and worry) that some folks might take some of our tools too far (as was noted above, making all your anchor text the same and seeing that in OSE is a concern), this post seems like precisely the type of "warning" we should be providing.
Danny - Pardon my jumping in here, but I'm sincerely interested in this discussion in the broad sense. When you say "build 'real' content" and "optimize it for your target terms", the next thing any business owner naturally asks is "Ok, what does 'optimize' mean?" So, early on, we all tried to figure that out, and in the late 90s, it had a lot to do with keyword density. Naturally, some people went too far, cheated the system, the engines adjusted, and on and on (you know that history far better than I do).
The problem is, it's still a fair question, and I think it's legitimate to try to build tools to help people figure out what Google wants. It's all a matter of degree. If I use SEOmoz tools (or any others) to help write better copy that puts my pages in front of relevant searchers for relevant queries, that's win-win. If I use those tools to game the system, then at some point a line gets crossed and I'm creating spam. The same tool can both improve search quality AND harm it, but ultimately it's just a tool.
Of course, it's our collective job to try to explain the difference, but that difference is highly nuanced. On the other hand, if we just say "optimize your pages," we're leaving a lot of people in the dark. So, what's the right approach? I think we're all revising our answer to that question every day, and I'm not sure what else we can do.
It is a matter of degree, in how you use tools and in how you approach SEO in general. But I think the most important thing to understand is that it is far more art than science.
There are so many factors that are involved in ranking any page -- and factors that might even be applied on a per query basis -- that talking about the science of SEO is somewhat laughable.
As others have pointed out, there's probably no problem finding plenty of "over-optimized" pages that nonetheless still rank. Add to that actually spammy pages that rank, or quality content that does NOT rank.
For me, the right approach is to build up a site that has quality content, first and foremost. Ensure that it is search engine friendly in how it is designed, applying things that have consistently been shown to work (especially titles). To try and build up quality links from sites that have an audience that I'm after.
After all that, THEN I might wonder if there were little tweaks here and there that clearly seemed to be making a difference. Though honestly, I think you'd still be better off asking if there were vertical search resources you should go after.
SEOmoz, for example, has no "juicy universal search opportunity" tool that I know of. The focus is on web search, even though blog search can get you into the top results -- as can video search, or book search and so on.
I suppose it's the "guns don't kill people" argument. It's not that "perfect page" tools cause "over-optimization" or time wasted chasing the algorithm. People do that. And any SEO is, to some degree, an algorithm chase.
But I guess what I've seen here, as I've come over more in the past month since the LDA thing erupted, is a sense that some people are getting lost in the tools and the feedback from them. That some, perhaps many, are simply going too far out of balance, running all these reports and chasing factors that might not even have an impact.
If Rand's post today helps them strike some further balance, that's great. That was part of the reason I commented in the first place about LDA. The reaction I saw to that was "woo hoo, here's something else we can chase." And I felt for many people, that was a chase in the wrong direction.
I appreciate the deeper explanation, and I can't say I'd argue with any of that. It seems that, as we (all of us) more aggressively build tools, we have to occasionally stop, re-educate, and re-evaluate. It's the nature of people to grab the low-hanging fruit and to go a bit overboard, and if we claim to be "white hat", it's our responsibility as SEOs to not only build the tools but help people use them responsibly.
For me personally, it's not just about Google's rules and even broader search ethics, but helping my clients build long-term success. If a tactic is high-risk and short-term, and I can make them money this month but it could all fall apart in 6 months, that's not a good risk, IMO. Even worse, if I'm tricking searchers and not building real value for them, I'll probably get a client traffic but no sales. That's just bad business all around, and it would make me a lousy consultant.
Lots of pro baseball players were using steroids in the late 90's / early 00's. Sure, the game would have been better, more pure, more "value"... but jeez, those guys that were juicing were sure mashing a ton of home runs. Until baseball came along with better guidelines to clean up their act, that is.
Until baseball (Google) levels the playing field, there are going to be lots of juicers at the top of the rankings. Do you want to be a good law abiding baseball player, or do you want to put up monster numbers and get the big contract?
Sure, doing things the "right" way today will likely be a good move tomorrow, or in a month, or in a year, or in a decade when Google cleans up some of those sites - but try convincing a client to wait that long for the person in the top spot doing things the wrong way to fall from the index.
And on another note, I can't believe it's been this long since I've posted here. Good to see some old friends still around and active though.
Anyone want to meet up in New York at SMX East next week? Just shoot me a PM if you're going to be there. I've been given two sessions there (shameless plug) and I am looking forward to seeing lots of friends that I haven't seen in a while there.
I generally agree with what you say, Rand, but I have to respectfully point out the glaring contradiction:
The on-page optimization portion of your SEO tool pretty much recommends that we do exactly the sort of practice you parody above. I will admit that my optimization tends to be a bit on the heavy side, but since I started using the SEOmoz tool, it has frequently told me many of my pages are under-optimized.
I consistently get docked for: not having the keyword at the front of the title, not having the keyword at the front of the H1 (or no exact match in the H1 because I'm using a variant), and not having 4 exact matches in the body text (or even having no exact match, because I'm using variants that I know Google considers to be essentially synonyms). Combine this with the fact that each page is optimized for two or three variants, and in order to get an "A" for each variant, I have to make the page look pretty dang spammy.
In order to follow your recommendations in this article (which I think are essentially correct) I would have to shoot for "B" and "C" grades for most variants.
If this is genuinely your opinion of best practices, the tool should be modified to match it.
Edit: W00T, while I was writing this, Danny Sullivan posted basically the same point. Score!
I hear what you're saying, and I can't deny that I've found myself in the same shoes.
But in the end, also taking into account what's been written in other posts (including here: remember the H1 value issue?), what I've decided is to use the Web App's on-page tool more as a best-practice suggestion tool than as something to follow without a critical eye.
And the "critical eye," IMHO, should be this simple question: "Do I like what I'm reading? Is it bothering me?" If the answer is yes, I try to cut some of the "optimized" stuff I've put on the page in order to improve its general usability, and then plan other tactics to compensate for what I took out. Maybe that means creating new pages for the keywords I couldn't optimize on the original page without sounding spammy or simply ridiculous, and trying to make those pages useful and linkable (as Danny suggested and still suggests).
So I don't think the tool is to blame - it does its job, saying "Hey, it would be better to have your keyword here, here and here." The blame lies with the person using it without common sense... which, in the end, is so often the real rule of SEO.
I agree, and I still think the tool is incredibly useful, don't get me wrong. However, does it make sense to have the tool grade as "A" a page that is over-optimized, and grade as "C" a page that is correctly optimized and ranks #1? (I've got numerous cases of that.)
The only reason I would blindly follow a tool is as an experiment to see how well it works. I'm not worried about that. I'm just suggesting that the tool could be improved by tuning down its suggested level of optimization a little, and/or making it smarter about variants and synonyms.
OK... I follow you... and yes, maybe the visualization of the grades should be different, in order to avoid incorrect use of the tool itself.
Maybe it should calculate grades not by the number of positions where the keyword is found, but as a weighted sum over those positions, with presence in the title acting as a wild card (i.e., if the keyword is in the title, the grade can be an A; if not, it can never be an A).
With a weighted grade, we would easily see whether a page is out of balance in one direction or the other (over- or under-optimized) - something like the sketch below.
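For what it's worth, here's what that idea might look like in code. This is only a sketch of the proposal above - the location weights and grade cutoffs are invented for illustration and are not the SEOmoz tool's actual scoring model.

```python
# Hypothetical importance weights for each on-page location.
WEIGHTS = {"title": 0.40, "h1": 0.20, "url": 0.15,
           "body": 0.15, "alt_text": 0.10}

def weighted_grade(locations_with_keyword):
    """Grade a page from the set of locations containing the keyword.

    Implements the "title as wild card" rule above: without the
    keyword in the title, the grade is capped below an A no matter
    what else matches.
    """
    score = sum(WEIGHTS[loc] for loc in locations_with_keyword
                if loc in WEIGHTS)
    if "title" not in locations_with_keyword:
        score = min(score, 0.59)  # cap: no title match, no A
    for cutoff, grade in [(0.85, "A"), (0.70, "B"),
                          (0.55, "C"), (0.40, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(weighted_grade({"title", "h1", "body"}))            # B
print(weighted_grade({"h1", "url", "body", "alt_text"}))  # C (capped, no title)
```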
Rand,
I fully agree in theory, but in the end Google is the only one to blame for crappy over-optimized sites. If Google rewards these sites with top rankings, then people will continue to build them. I feel our role as SEOs working for businesses is to maximize ROI. If giving Google what Google rewards maximizes ROI, then that's the game you have to play until the rules of the game are changed.
To me, the biggest reason not to build over-optimized sites has nothing to do with Google, but rather with conversion rate optimization. Ranking well is great, but if the site reads like literary stuttering, with the same words repeated over and over again, then users will leave just as quickly as they came. ROI is the goal, and SEO is just one factor alongside CRO and many others.
Agreed. One of my site's competitors is absolutely killing it in the rankings, with top rankings in twice as many keywords as us, and it is a spammy, unusable site stuffed to the gills with keywords. While the rankings have remained steady, they've seen a steady loss in traffic, month-over-month. I shudder to think of their conversion rate.
The end-all be-all of SEO isn't ranking. It's quality traffic and conversions.
That said, it can be difficult to convince clients and executives that you're doing the right thing by not acting like those other sites, despite their high rankings.
Rand, I truly appreciate your efforts to promote best SEO practices in the industry. But there is really no such thing as overdoing SEO or over-optimization. Your site is either optimized or it is not. Also, the danger associated with overdoing SEO - the over-optimization penalty - is more myth than reality. I personally never got any 'over-optimization' penalty, even when I deliberately did the so-called over-optimization in the past. Approving the over-optimization theory is like approving the theory that 'keyword density has a significant impact on rankings.' Here are Matt Cutts' views on over-optimization:
https://searchengineland.com/googles-matt-cutts-on-over-optimization-21471
He has clearly said in the video that "there is nothing in Google that we have like an over optimization penalty for".
Then why is over-optimization bad for SEO? Because webmasters won't link out to a page that looks spammy, so you will have a hard time getting editorial links. Why is it bad for business? Because visitors' perceived value of the business/brand may drop dramatically once they see a page stuffed with keywords, developed for search engines and not for them. That results in little to no conversions. So you are right about the danger, but the reasons are different :)
I believe he isn't referring to being penalized by the search engine so much as not gaining the end user's trust, and overall conversions. Let's face it: as SEOs, if we visited a website that looked like it was trying too hard at optimization, we wouldn't use it either. We have to assume the majority thinks the same way.
Edit: I see now you mentioned the dangers for conversion. Making my point moot :)
There are plenty of people who "over-optimize" and do fine, sadly, but we've seen the flip side many times. Massive anchor-text stuffing is probably one of the biggest ones that I've seen result in dozens of penalties. Granted, it's anecdotal, but when someone's home-page has been deindexed and they have 90% of their anchor text all using the exact same phrase in the exact same context, the picture gets pretty clear over time.
There are other, subtler issues with tag-stuffing, too. Here's one I see a lot. People start all of their TITLE tags across their site with the same 10 keywords, adding the unique elements at the end. They end up with two issues: (1) their ENTIRE site is now competing with itself for those keywords, and (2) Google sees duplicate content everywhere. Is it an outright Capital-P Penalty? No, but it can have a huge negative impact on ranking and even indexation.
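To make that concrete, here's a rough way to audit crawled title tags for the repeated-prefix problem. A hypothetical sketch: the six-word prefix length and the 50% threshold are arbitrary illustrative choices.

```python
# Flag title-tag prefixes shared by a large share of a site's pages.
from collections import Counter

def flag_shared_title_prefixes(titles, prefix_words=6, threshold=0.5):
    """Return (prefix, count) pairs starting more than `threshold` of pages."""
    prefixes = Counter(" ".join(t.lower().split()[:prefix_words])
                       for t in titles)
    return [(p, n) for p, n in prefixes.most_common()
            if n / len(titles) > threshold]

titles = [
    "Cheap Hotels Bangalore Budget Luxury | Contact Us",
    "Cheap Hotels Bangalore Budget Luxury | About",
    "Cheap Hotels Bangalore Budget Luxury | Rooms",
    "A Guide to Choosing a Hotel in Bangalore",
]
print(flag_shared_title_prefixes(titles))
# [('cheap hotels bangalore budget luxury |', 3)]
```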
As Matt Cutts said in that video, "it's a euphemism for spam." The law of diminishing returns also plays into this: after mentioning Hotels Bangalore the first time, each additional mention diminishes in value.
@seo-himanshu I have to disagree with you. No penalty for SPAM (over-optimization)? Yes there is, a lot. And you can learn all about that in the video link you posted. Matt Cutts was saying almost exactly what Rand says. https://www.hotelsbangaloreindia.com is pure SPAM, and they're also there because of the domain name, nothing else.
Thanks very much for this Rand.
I thought now would be a good point for me to highlight something that has been coming up in Q&A a LOT lately, and I think this post really hammers it home: more often than not, in the cases I've looked at recently, the struggles of people asking why they can't rank for specific terms come back (at least in part) to anchor text distribution. I would definitely recommend that people make use of this metric in Open Site Explorer.
Except in rare cases, a site with a "natural" link profile should not have 10,000 links with exact-match anchor text from only 3 domains plus 100 links with the brand/URL as the anchor text from 5 domains - and still expect to rank for a competitive term.
Anchor text is the one area where people really seem to have been over-optimizing of late, so I just wanted to highlight this one in particular; a quick way to eyeball your own distribution follows below.
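Here's a back-of-the-envelope version of that check for anyone who wants to eyeball their own profile. The anchor list would come from an export of your link data, and the 60% warning threshold is an illustrative assumption, not a known cutoff.

```python
# Print the top anchor texts pointing at a page, with their share.
from collections import Counter

def anchor_text_report(anchors, warn_share=0.60):
    """anchors: list of anchor text strings pointing at one page."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    for anchor, n in counts.most_common(5):
        share = n / total
        flag = "  <-- suspiciously uniform" if share > warn_share else ""
        print(f"{share:6.1%}  {anchor!r}{flag}")

anchor_text_report(["cheap widgets"] * 9000 + ["Example Corp"] * 800 +
                   ["example.com"] * 150 + ["click here"] * 50)
```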
In my recent experience, "getting the link" when you're doing link building is so much more important - you're probably better off leaving the anchor text up to the linker :)
Agreed, but the fact is sheer link volume can still work (even with crap blogrolls). Until Google's algo totally discounts these links, people will continue to use them.
Yes - and otherwise you are left out of the game if you don't do it. It's OK if you want to rank after 10 years. And by then Google will have been replaced by someone with a new algorithm :(
My problem with leaving it to the linker entirely is that at best they will often use your site title, and at worst "View Website." I like to give the linker 3-4 anchor options, and even if they stray from those, they at least see the direction and will probably anchor better.
What's really tough is telling people where to go next, especially when they feel (and sometimes rightly so) that all of their competitors are playing the same game. The problem is, it's a long-haul to revamp your content and link profile and you have to take a long-term view. It also sometimes means risks in the short-term. You might make your tags LESS spammy and actually take a hit in the short-term, which really sucks.
In any business, though, I think the long-term view is critical. Getting search traffic for a couple of months, basing your business around it, and then losing it all can be catastrophic. Better to have a slower climb that lasts years.
Completely agree. I am writing a post about this right now, specifically geared towards blogs and comment spam links. I was looking at the link profiles of some highly ranked sites and seeing thousands of links from blog comment spam. These have a high probability of getting moderated or deleted. The short-term gains can be great (as in ranking well right now) but, like you said, Dr. Pete, they aren't sustainable. And why invest the time in SEO if it isn't going to last?
OT: If the webmasters of those sites did their homework, you would not be able to check their backlink footprint in the first place. Other than that, I agree with you.
Hi Sam,
If this is the case then why is the Anchor Text Of External Links to the Page counted as over 20% of Google's Ranking Algorithm Components on this SEOmoz blog post in the Learn SEO section?
Hi Romancing-
I don't want to steal Rand's thunder but I think this is one of the many problems we have with an industry that is ever changing and a sizable back-catalogue of posts. It's really difficult and time consuming to update older posts, yet the algorithm is constantly changing.
To answer your question: I would say that this has been a relatively recent change, for starters. Secondly, I think the point that it has been "overdone" is the real concern; the occasional exact-match or near-match anchor text link is great, provided it is (or looks) natural.
The only other thing worth pointing out is that the 20% number you're referring to is actually the result of a survey (conducted summer of 2009) of 72 SEOs rather than a hard and fast suggestion that it accounts for 20% of the algorithm, I think it was meant to imply that 20% thought it was a very important factor rather than being the most important, etc.
All I can say is that what I'm seeing lately is a shift from this (particularly if it's been overdone) and I'll be interested to see what the next survey has to say! :)
The SEOmoz team has been discussing ways to handle some of the older posts/potential issues like these and any feedback to that end I'm sure would be much appreciated.
Thanks for bringing this up though!
"Those sites and pages that follow every single optimization tactic, from internal links to massive keyword focus to "perfect" anchor text in their off-site link building are going to stand out like sore thumbs to the engines."
And that's just the point - search engines have very smart people working to programmatically detect the signals that may indicate a spammy or over-optimised site. Anchor text has to be amongst the easier signals to detect. Detune, detune, detune!
SEO's so much more about helping clients understand their environment while developing great products, websites and content, and so much less about the bull-in-a-china-shop, over-cooked anchors, titles and internal-links strategies of days of yore. I'm personally glad about that, too. If SEO hadn't changed in the past few years then I suspect I'd be working in a different field by now :-)
"search engines have very smart people working to programatically detect the signals that may indicate a spammy or over optimised site"
Here is the over optimized site ranking no.1 for 'hotels in bangalore': https://www.hotelsbangaloreindia.com/ on google.co.in
I could fill the whole page with such examples if that were the only thing I had to do. The travel, gambling and porn industries are chock-full of over-optimized sites openly breaking every law in the Google book (from excessive link exchanges to purchasing large numbers of links). ..........Smart people?
I'm not sure where you see that site as being over optimized ;)
I think Rand should do a follow-up and explain one example of Optimized and one of Over-Optimized. Opinions are hard to gauge without visuals. I think a Roger appearance is needed.
Um... well, the title has 10 keyword variants, the first sentence is all keyword variants, the body is full of bolded keyword variants, and the left nav links are all keywords.
If that's not over-optimized, then you're right, I'm not at all clear on what Rand is talking about. I mean, I thought my optimization was a bit heavy, but this is much more so.
So I admit it, I'm confused. Rand, what the heck are you talking about, and how does it differ from correct optimization?
I'd say Himanshu's example above is a very solid illustration of the graphic I made in the post.
Ah! And maybe you haven't seen the "Our Network Sites" you can visit from their footer...
OK, then Himanshu's point needs to be addressed. This page may be spammy, but it's working. I'm not saying we should make pages that look like that - it's way over the top, IMO - but doesn't it make sense that people follow this route when it in fact works?
I talked about this briefly in the post as well - how many bad sites that rank well inspire others to use these not-ideal practices. Over time, the engines filter these out, but in the meantime, they're teaching people to do SEO in a short-term way that harms all the parties in the system (SEOs, engines and searchers).
To be honest, I don't know how we combat this - I think it's up to Google to do a faster/better job rather than taking the many months or even years to fix these.
Of course the problem is that not everyone is trying to build a long-term, sustainable business, so they'll do whatever they can to ensure short term financial gain over long term business, amongst the worst offenders being those adSense sites that offer nothing of any value.
What sickens me most are those link spammers who will exploit any forum (used in the loosest sense) that enables followed links. I recently explored a competitor's link profile to find that they (plus a number of other outwardly respectable businesses) had comment spammed a private individual's website, set up to provide support to people with disabilities. Utterly despicable.
Sadly, no amount of moralising will impact these people, as their 'over-optimisation' is quite deliberate.
I think that one of the reasons we see so much of this kind of over-optimised approach is that its practitioners tend to prey on the ignorance of their clients. You see people selling SEO like some kind of alchemy that only the select few have access to - and because the client doesn't know any better (and why should they?) they're happy to hand over their cash.
I think this is still one of the big problems with the industry, and it says a lot about SEO as a business still being in its infancy.
I have recently taken over two accounts from a big "award-winning" agency here in NYC and was just blown away at how they (the big agency) were ripping off their clients: keeping them in the dark about what they were doing, buying paid links without explaining the hazards of this practice, and generally providing little service for a lot of money.
I have come across proposals from "reputable" firms that include outdated or flat-out bogus SEO tactics. The worst part is, these tactics are often the same ones many potential clients believe SEO is all about. Do these firms really believe the tactics they are practicing are the way to produce results? Or are they simply telling the client what he/she wants to hear rather than working through the challenge of educating them?
Setting expectations and providing some level of education is inherent to being a professional in this field. To ignore that is where true ignorance exists.
kennfusion, what's wrong with paying for links?
I can't speak for kenn, but this might make for good reading if you haven't seen it before - https://www.seomoz.org/blog/our-stance-on-paid-links-link-ads
That's great....but what are we as SEOs supposed to do with this advice? You say that over-optimization goes against industry best practices. But where is the line drawn? I'm an experienced SEO, have been to many conferences, have many industry blogs on my reading list, and have read a million trillion 'Best Practices' guides, decks, blogs, etc. from respected industry experts and speakers. It's not that I disagree with this post...I completely get your point. I am just at a loss at what we are supposed to tactically do when industry experts, guides, tools, etc. give us guidelines, advice, feedback and results that are somewhat contrary.
And, what exactly is under-optimization? Are you suggesting that by keeping titles purely action-based, focusing purely on content, not having many keyword-rich anchor text links or optimized URLs, etc, etc. and just letting things be we will see top rankings?
Again, I totally respect SEOMoz and have for years, and I get your point, but am at a complete loss about what exactly the takeaway is here...and what practical tactics and strategies we should employ instead.
I really have to agree with this comment... I expected this post to close out with a suggestion of better practices. To be quite honest, my on-page targeting is very simple and straightforward... here it is (I generally only target a page for one KW phrase and let the engines rank me for the variations):
Title Tag: Keyword Phrase | Brand Name
Meta Descrip: I describe the content on the page, add the kw phrase or a variation alongside a call to action if it makes sense
Rel Canonical: <link rel="canonical" href="https://www.yourwebsite.com" />
H1: I try to add my KW phrase in one if it makes sense to the look and feel of the page - and honestly, you can just use CSS to make it fit the feel of the page without gaming the engines. (Yes, I'm aware that this supposedly carries very little weight, but is there strong evidence that it doesn't matter at all? I haven't seen any.)
Img Alt text: I put alt text in all my images, but I try to find one where the KW phrase makes sense and add it to the alt text.
Body Text: I try to use my exact KW phrase I want to rank for at least once, after that it is used if it makes sense or there are variations. I don't see what the problem with this is, if the page is about what you are targeting, you are bound to use the phrase and variations of that phrase a few times.
Page URL: I make sure it's a clean URL, and if I'm creating a new page or optimizing a page that doesn't have a ton of weight, I'll create a URL with the keyword phrase. We all know that you see KW-rich URLs dominating page one.
The only things I didn't use that are supposedly "over-optimization" are multiple KW variations in the title and KW-rich footer links.
Would you consider that over-optimization? I always looked at that as a perfectly targeted page. On-page is one thing we have a lot of control over, and if this is considered overkill, I'd like someone to propose changes. (A quick way to sanity-check a page against this list follows below.)
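Here's that sanity check as a minimal sketch, with the page represented as a plain dict. The element names and pass/fail rules mirror the checklist above, not any engine's documented requirements.

```python
def check_on_page(page, phrase):
    """page: dict of on-page elements; phrase: the one target KW phrase."""
    phrase = phrase.lower()
    checks = {
        "title starts with phrase":
            page["title"].lower().startswith(phrase),
        "phrase in meta description":
            phrase in page["meta_description"].lower(),
        "phrase in h1": phrase in page["h1"].lower(),
        "phrase in an alt text":
            any(phrase in a.lower() for a in page["alt_texts"]),
        "phrase in body at least once":
            page["body_text"].lower().count(phrase) >= 1,
        "phrase in url slug":
            phrase.replace(" ", "-") in page["url"].lower(),
    }
    for name, ok in checks.items():
        print("PASS" if ok else "MISS", name)

check_on_page(
    {
        "title": "Office Coffee Machine Rental | Example Brand",
        "meta_description": "Compare office coffee machine rental plans and prices.",
        "h1": "Office Coffee Machine Rental",
        "alt_texts": ["barista pouring espresso",
                      "office coffee machine rental plans"],
        "body_text": "Our office coffee machine rental plans keep costs predictable...",
        "url": "https://www.example.com/office-coffee-machine-rental/",
    },
    "office coffee machine rental",
)
```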
Meta Description: I leave it empty and let the search engines pick it from the content. Same for the keywords.
Great article, Rand. This should be the first SEO article starters read, even before they learn what keywords mean, so they can avoid mistakes in the early phase. I wish I could have read something similar in my early days. :)
I personally learned SEO and then ultimately decided to outsource to a company called Ranking Monster. They do a great job, as they have lots of manpower and up-to-date SEO knowledge. I am better off focusing on other things now :)
It's easy to get hung up on keyword obsession and forget information needs. Let me explain. Say you decide to target the key phrase "rent office coffee machines." Your research tells you search volumes are good and competition is low. You plunge into on-page and off-page SEO shenanigans and hit the launch button, expecting that any time now the MD will be shaking your hand and introducing you to your new office with a city park view. But oh no! Your web analytics shows no one is hitting the buy button or asking for a quote. Why? Because yes, you've optimised the page with lots of clever derivations of your key phrase, but you've overlooked the information needs of the user.
In the example of "rent office coffee machines," you've omitted to detail prices, machine choices, cool jQuery effects to explore pictures of the machines. In a nutshell, you've overlooked the information needs of the user. The result: no stickiness, no one hangs around the page, and you've been diagnosed with a first-class case of keyword myopia :-(
Good article, Rand:
"At its best, our profession is about making amazing things that people are asking to see..."
The first slide in my training deck is titled What Is SEO and then says:
"SEO is the art of connecting your products and services with the people who are searching for them."
To me, this statement encapsulates the desires of all three players in SEO, the people searching, the site trying to be found and the search engine itself.
Himanshu's comment is an interesting view of "over-optimization," as it points to the "psychological" effects of having an over-optimized site.
But finally I believe you are both saying the same thing (do not overdo SEO), just from different perspectives.
Over-optimization is something all of us here have done. And surely it's something people starting out in SEO on their own do as well, because - in the short run - it helps gain good rankings for non-competitive keywords using on-page SEO alone (a great percentage of websites are still not optimized at all). Then people start reproducing the over-optimization tactics in link building (Sam above explains it very well, and it's a tricky tactic that can lead to penalization for being unnatural). The result: a website that risks being devalued over time, both through penalization (if you go too far into keyword stuffing and a non-editorial link profile) and by failing to increase the number and quality of its links.
But what is the real reason for this hyper-zealous way of SEOing? It's a failure to focus on the correct objective, which is not to be first in the rankings just to be first - it is to be first in the SERPs in order to push the biggest volume of visitors into the conversion funnel. And if the conversion funnel (and here is where Himanshu is perfectly right) is totally clogged by over-hyped SEO'd pages, like rocks blocking water pipes, then it will be very hard to achieve the ROI the site is supposed to deliver. Simply put, the bounce rate will be spectacular, because people will run away from a site that says nothing to them but "keyword, keyword, keyword and another keyword."
And that is the main complaint I hear from clients coming from bad in-house or bad SEO services: "We have xyz visitors but we do not sell anything."
What really stinks, though, and will continue to stink, is that a lot of the time that "crap" still works. Yep, it would be a perfect world if everyone were providing value. But most of the time, people want shortcuts without taking the time to add the value part. When I'm building a site for a private investigator, and he sees the guy in the #1 spot with the crap page you described above, then the client wants the crap page, because it is working for the person in the #1 spot. It's tough to argue that the "right" way is better when the "wrong" way is #1. You're preaching to the choir here, but a lot of the time the crap floats to the top. And geez, I really don't want that taken literally, as that is a very disgusting mental image.
I find this article very interesting, since I am sensitive when it comes to keyword stuffing over great content. Overdoing SEO means that, most of the time, the quality of the content has to suffer from keyword stuffing and so on. There are also people who believe SEO is a one-time deal, so they overdo it and put all their "knowledge" into it at once. So I'd say one of the disadvantages of overdoing SEO is leaving a bad impression on your site's visitors, not only on search engines.
Best SEO quote I've heard in a while: "It might sound ironic, but there's an art to under-'optimizing' in order to achieve true 'optimization.'" And I agree.
How many times must it be said that creating good content is the only way to long-term SEO results? I swear most people see all the work that's involved, give up mentally, try a bunch of tactics that don't work, then rinse and repeat when the flavor-of-the-month tactic didn't work.
Spot on!
Ideally you'd want both: spectacular unique content of high end-user utility, and solid, well-thought-out strategic optimization of on-site elements and primed link exposure.
Two thoughts:
Your metaphor could lead to a new definition... the Porn Way to (bad) SEO.
SEO Porn....MMmmmmm
This is a very interesting discussion and one that I've been thinking about for a while.
I guess I feel that as long as you ensure your content is good and interesting to users (the kind of thing that will naturally be shared, linked to, Facebooked/Tweeted etc.), then making sure your targeted keyword is in the title tag, H1 tag and in your body content at least once should be fine.
I agree that if a site - like your example - essentially spams all possible tags with their keyword and different variations, then this is clearly wrong.
But doing keyword research and then ensuring your page is well optimized is something I think is fine. (And as others have said, SEOmoz actively promotes heavy optimization via its paid SEO tools, which is somewhat confusing in light of today's blog post?)
The other thing to consider is that - unfortunately - Google still loves this sort of thing. As Jill Whalen said in her open letter last week ("Dear Google... Stop Making Me Look Like a Fool!"), Google seems to *prefer* unnatural sites with unnatural backlink profiles. Certainly, I've seen plenty of massively over-optimized (to a spammy extent) sites with EMDs and close to 100% "perfect" anchor text, and Google rewards them with a number 1 ranking. This doesn't mean everyone should spam, but Google has shown no signs of working to clamp down on this (if anything, I feel it's gotten worse in 2010).
It's a tricky issue though, I definitely acknowledge that.
As above, my overall opinion is that as long as each page has good content and delivers *value* to the visitor, ensuring that each page is optimized is a good thing.
My thought when reading this is not "Well heck, I shouldn't overoptimize!" - it's how does Google determine overoptimization, and how do I avoid it? Google can't possibly detect what qualifies as a keyword and what doesn't, can they? I don't believe so - perhaps they can determine terrible grammar in URL and title tag structure, but do they have some algorithmic implementation that punishes it at scale? I doubt it.
I think what they do notice is something like title tag spam - overloading title tags with similar keyword strings for different products, making it fairly obvious you're heavily tweaking them for SEO. I.e., if you have 1,000 pages on a site and 980 have the same title tags with 5% variation for each geographic location/product, you're showing clearly that you're creating an SEO-saturated site. But if you can diversify, even while maintaining keyword saturation (while ordering the title tags differently), I doubt that Google can detect and/or penalize such an action.
"Google can't possibly detect what qualifies as a keyword and what doesn't, can they?"
Of course they can, they have traffic stats, they are in the best position of anyone to determine what is a highly-searched keyword and what is not.
I've definitely seen cases where over-optimizing for a high-traffic term will get me bumped temporarily out of the top 100 while Google sorts out that I'm not spam - and then I pop right back in, usually at a higher rank.
They can SEE it, but are they using it? For the same reason the web-spam team doesn't seem to be making case-by-case penalties, making algorithmic changes on a keyword-by-keyword basis doesn't seem to scale.
I have no idea if they're using it, but they can certainly tell on an algorithmic basis, and I don't see why it wouldn't scale - very simple logic would be IF (traffic(keyword) > X AND keyword_optimization(URL) > Y) THEN (filter_out(URL)). Google could easily scale much more complicated logic with their infrastructure.
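Spelled out as runnable code, that hypothetical might look like the sketch below. To be clear, this is pure speculation about what Google *could* do; the toy scoring function and both thresholds are invented for illustration.

```python
def keyword_optimization_score(url_slug, keyword):
    """Toy score: fraction of the keyword's words present in the slug."""
    slug_words = set(url_slug.lower().replace("-", " ").split())
    kw_words = keyword.lower().split()
    return sum(w in slug_words for w in kw_words) / len(kw_words)

def should_filter(url_slug, keyword, monthly_searches,
                  traffic_threshold=50_000, optimization_threshold=0.9):
    """Filter only heavily optimized slugs on high-traffic keywords."""
    return (monthly_searches > traffic_threshold and
            keyword_optimization_score(url_slug, keyword)
            > optimization_threshold)

print(should_filter("cheap-hotels-bangalore", "cheap hotels bangalore",
                    120_000))  # True: high traffic, fully matched slug
print(should_filter("cheap-hotels-bangalore", "cheap hotels bangalore",
                    5_000))    # False: low-traffic keyword
```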
I'm convinced that the traffic level for a keyword is a factor in the algo. Not sure exactly how, but high-traffic SERPS are treated differently from low-traffic ones, and I suspect this is one of the differences.
It's kind of a mechanical turk argument - although algorithmically they could, are they? I definitely agree that it's possible; I just don't think it really makes sense for them to do. Are they going to completely change the algorithm for keywords that fall off this ranking scale?
I think it's far more likely that you perceive algorithmic differences because the competition is so high - minor fluctuations in the bigger algo or your pages cause a huge change in your ranking - rather than this occurring on a SERP-by-SERP basis, at least as determined by keyword "traffic."
What makes things more interesting is that Google tries really hard to make the search engine better and smarter every day, and the more they try, the easier it gets for sites with little or no SEO to rank higher.
Let me give you a perfect example: https://www.unitedautoinsurance.com. This site has been ranking great for numerous search terms for the last 3+ years (I know because a few of my clients are in the same business). Some of the search terms: Indiana Auto Insurance and Chicago Auto Insurance.
Did you know that the same website doesn't have a meta description anywhere on the site, had no meta keywords back when they were in use, and uses the same (way too long) title on its home page and across the entire site? Content? Well, not really much of that either - go ahead and see for yourself; look at the source code as well.
Now I can understand what SEO-Himanshu was talking about earlier.
Content is always very important to me, but backlinks are still far superior to any content out there, and in some cases, like my example above, nothing really makes any sense at all.
I am not even going to touch Google Places, where legit businesses are being rejected while spammers with kall8.com toll-free numbers and randomly picked downtown addresses are all over Google Places with many "adopted addresses," not just one - but that is a story for some other time.
One thing is for sure: the current time belongs to spammers, and there is very little we can do to change that.
From a very quick look using the MozBar and OSE, what I can say is that most of the websites on the first page for the keyword "Indiana Auto Insurance" are, generally speaking, not perfectly optimized.
You say that the competitor site is not well SEO'd... well, it surely has defects (the title being the worst...), but in order to rank on the first page with just its home page, those don't matter (I know, hard to understand, as they lose so many opportunities).
But as long as the objective is to have the home page on the first page for that keyword, the site is optimized enough.
That means it's a site that surely has things to fix, but we cannot say it wasn't SEO'd enough. Otherwise it wouldn't rank as it does.
Yes, I completely agree with you. Like I said, they rank high where they're not supposed to, that was all. From overdoing it to not doing it at all, and the site has been on top for over three years.
Thanks,
Emil
A healthy debate. A lot to think about.
Does the anchor text really matter? Isn't it more important to look at who is linking to you, in terms of their authority and original content?
Chris - good call. Yes, IMO. Hugely.
Anchor text matters quite a bit. Google views anchor text as the best description of a website.
From the new Google's beginners guide to SEO (released today) and about anchor text:
The anchor text you use for a link should provide at least a basic idea of what the page linked to is about. Avoid:
Then...
You may usually think about linking in terms of pointing to outside websites, but paying more attention to the anchor text used for internal links can help users and Google navigate your site better.
Avoid:
Draw your own conclusions...
Great post Rand - thanks.
The biggest issue for smaller SEOs like me is that so many of our clients' competitors are over-optimising in this way and succeeding. Especially on the anchor text point - widely abused and rarely punished; or if it is punished, it takes the engines months to catch on, and all the while other sites win traffic unfairly.
Yep. I've seen a lot of anchor text abuse going on as well. And of course it's the same as with any spam tactic...Google can't go overboard in penalizing it, because then people would go out and anchor text spam their competitors for results.
I don't want to spread any false hope (I too suffer from this anchor text problem), but just recently I've seen various posts talking about exactly this issue and how it seems something is changing - anchor text as a factor is starting to become less of an A-level signal.
Sorry, I'm not in a position to recover the URLs of those posts (if anyone can, please share them here in this thread), but surely a two-part experiment worth doing would measure how much weight anchor text carries now versus in 6-9 months, to see if the rumors are telling the truth.
Anchor text comparison - sounds like a project for the guys at SEOmoz!!!
I can see there is a tipping point with spam penalization, and abuse will be phased out at some point by the engines.
How will this affect anchor text for exact-match domains that have pure keyword URLs? Surely if you have a load of exact-match anchor text linking back to exact-match domains, the engines have to let it slide - but these guys are the spammiest of them all!!!
Thank you Rand for this article! And the comments from SEOmoz members/readers are really great as well for understanding exactly what trouble over-optimization generates.
Do you think Google will be able to develop new algorithms or new rules to fight such over-optimization and hand out penalties? It seems like nothing is on the road yet...
I think we need to be careful here and make clear the difference between:
a) Overdoing SEO
b) Using SPAM techniques for SEO
I personally don't believe that you can over-optimise a website; the examples given are more of a spamming nature.
It's important that your sites are built for visitors, not the search engines. By creating a page stuffed with keywords you might get a temporary high in the rankings, but you'll instantly send visitors to the close-window button, as your site content makes no sense to them.
SEO isn't just about rankings; it's about gaining prospects and turning them into sales.
In response to a post above: SEO certainly is a testing process, and ultimately no one has control over anything, as it's all in the hands of the search engines.
Some thoughts regarding "Overly lengthy, keyword-rich URL strings":
We've been hearing this more and more lately. What I'm wondering is how to go about it specifically. How long is overly lengthy? For example, WordPress takes the title of the post and pulls it into the URL. Those URLs can be quite long and keyword-rich, depending on the title of the post. So is this a bad thing? We all see many blog posts with these long (some very long) keyword-rich URLs performing quite well. Sure, I wouldn't write my title to be keyword keyword keyword...
My approach is to keep the keyword focus (without keyword stuffing) but write for human visitors, so that the title makes sense, is compelling and gives them what they came for.
I use WordPress as well for development, and having a "longer" URL isn't really the issue; it's things like stuffing additional keywords in there. The titles and URLs of some SEOmoz posts can be pretty long but still relevant. I saw a great example of over-optimizing below, but I took out the incriminating details...
www.somedomain.com/city-state-video-production-website-seo-interactive-marketing-company-nameofcompany-contact-us.php
This was the contact form page. You can imagine what the home page title looked like. This is what I am seeing from people who think they know how to "optimize" a website, and it is over-optimized. The goal is to include keywords where they are relevant while keeping the main focus on providing a good experience for your user.
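For WordPress-style slugs, a small helper along these lines keeps a title from turning into the stuffed URL above. The stop-word list and the six-word cap are arbitrary illustrative choices.

```python
# Build a short, readable slug from a post title.
import re

STOP_WORDS = {"a", "an", "and", "the", "of", "to", "in", "for", "on", "us"}

def clean_slug(title, max_words=6):
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOP_WORDS][:max_words]
    return "-".join(kept)

print(clean_slug("Contact Us - City State Video Production Website "
                 "SEO Interactive Marketing Company"))
# contact-city-state-video-production-website
```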
Good points as always. From a non-technical standpoint, I think the overall point is to try to have things look as natural (not spammy) as possible.
If all your title tags, H1 tags, etc. are overly stuffed with keywords and your backlinks all use the same exact keyword anchor text, it definitely doesn't look natural; it looks spammy. Still, as you mentioned in your earlier post on webspam last month, many of these 'spammy' sites are actually ranking quite well, so it's easy to see why people 'overdo' their SEO.
Hi Rand,
How are you doing???
I feel that the information above throws light on the SEO industry's worst practices: keyword stuffing, spamming, excessive keyword density, low-quality content, how you can lose relevancy, and so on.
That is great. I really enjoyed reading it. But, I honestly feel that the TITLE you chose for this POST has less relevancy to the INFO/CONTENT you provided.
After reading the TITLE, I thought the post would be about how overdoing 'good' SEO harms a website. But the info given is the exact opposite - it is closer to black-hat techniques.
P.S I really love reading your posts. This is just my opinion about this particular post.
I was reading through this post and thought I'd give some of my thoughts...
I truly believe there is more of a focus on brands but not totally in the sense most people might look at it. I believe Google (and possible other search engines might be taking after their lead) searches out and awards you some "brand points" for ever reference back to your website be it your companies phone number, email, physical address, etc... These then affect your overall ranking source card. But not just that but I think social media pages that are owned by your website/company and reference back to them, this also counts as some brand points.
So part of the misconception I see is that people think brand only has to do with what's on their website. Yes, you should have your company name, phone number, email, physical address and so on on your site (it's needed for search engines to check which sites are referencing your brand), but branding is more of an off-site thing. You need to get your name and info out across the web so you look like the legitimate business that you are.
A very interesting discussion indeed. Thanks Rand for bringing up this topic.
SEO, as Matt Cutts mentioned in one of his videos, is like polishing your resume: you try to highlight, focus and put forward your education, achievements and preferences. Similarly, you try to make your website rank high by presenting every applicable aspect of it in a form the bots simply cannot neglect.
Content and keywords are the two main factors. There was a time (7-8 years back) when the repetition of keywords in the content was important, though there was an art involved in presenting that content; now it no longer matters. Even then, when content had keywords repeated purely for the sake of SEO, it was considered spam and could be deemed over-optimization.
Now it is the quantity, quality and frequency of content that is relevant, fresh and informative which matters most, so the focus has to be on that. Such content can never be considered spam or over-optimization if it is genuine and original.
Keywords are still the backbone of any SEO project, and having keywords in the anchor text, content, alt text, file names, image names and URLs is important. But if you overdo it in any way, you end up bragging in your resume, which is surely going to be a reason for rejection (in this case, by the search engine spiders).
As we always say, the website is for visitors and bots. The optimum balance, which pleases the visitor while offering information to the search engine spiders, is the state of optimum optimization. Keywords in the anchor text, content, alt text, file names, image names and URLs not only give more information to the spiders but also offer relevant information to the visitors. There is a lot of art involved in doing this too.
As SEOs, we have to focus on understanding the science behind the algorithms and develop the art of applying it to websites. This is an ongoing process: the algorithms keep changing, and we too have to be creative in our application methodologies. There cannot be a hard and fast rule for every aspect of SEO; what is applicable today may not be applicable tomorrow.
"There was a time (7-8 years back) when the repetition of keywords in the content was important but there was an art involved in presenting that content though now it no longer matters ."
No longer matters...
Please explain.
It no longer matters how many times the keyword is repeated in the content.
If the content is relevant and informative for the keyword, then it has a chance of ranking even if the keyword occurs just 2-3 times in the content.
Now the emphasis is on fresh, relevant, informative and original content rather than merely keyword-rich content.
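For reference, the kind of repetition metric being dismissed in this exchange is trivial to compute, which is part of why it was so widely chased. A toy sketch of the (now largely irrelevant) keyword-density figure, with an invented sample sentence:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single-word keyword as a fraction of all words."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

sample = ("Fresh relevant content beats repetition. Repeating a "
          "keyword keyword keyword adds nothing.")
print(f"{keyword_density(sample, 'keyword'):.1%}")  # density of 'keyword' in the sample
```

The point of the thread is that optimizing a number like this tells you nothing about whether the content is fresh, relevant or original.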
Very good post about the danger of overdoing SEO. Search Engine Optimization (SEO) is meant to help you rank better in the search engines; however, abusing and overdoing it can yield very different results.
Great article. It is true that sometimes we over-optimize our sites to get better rankings in the SERPs. There should be a good balance in the on-site and off-site techniques we apply. This post is very useful, and I would like to say thanks for it.
Amit
Rand, great points. I'm not an SEO professionally, but rather out of necessity (the SEO we had hired before did an incredibly poor job, so I took it upon myself to learn the basics). Nevertheless, for a beginner, it is easy to over-SEO a site. I know firsthand that initially I did too much, which may have seemed spammy (e.g. including keywords in URLs even when not necessary). Anyway, this type of post helps us refocus on our ultimate objective. Mahalo.
This is a terrific blog post. It really explains the dangers of too much SEO and shows that less is really more.
I've seen this issue compounded when a group - typically on a forum (think a specific print-on-demand service, affiliate program, shopping cart, etc.) - gets together and fuels each other's bad SEO advice. There may be one or more people doing really well, giving out advice that they believe is the reason for their success, when it actually isn't, but all the others take the advice and run with it. I guess it's the blind leading the blind.
This post makes me want to believe that spammers aren't winning, but the sad fact is that I think they are.
SEOmoz has some of the world's best SEO software, and community, yet sits at about rank 80 for "SEO software," based on my geolocation and various factors. Should they be number 1? Yes.
We need change.
SEO really is a testing process, and I totally agree that over-optimisation can hurt. It's about finding the right balance - and then Google will change their algorithm :-) I guess that's why I like the challenge. LT
Fortunately, if you dedicate yourself to reasonable TITLEs and TAGs and focus on real, valuable, fresh, unique content in your pages... Google rewards are around the corner. We learned a lesson recently, when we noticed that we'd dropped to #87 for our market. Site updates and blogging catapulted us to #25 within 48 hours.
The real frustration comes from Google's incessant algorithm experiments. We've had several client sites suddenly drop from #1-3 on page 1 to page 15. Then magically they pop back to #1 after 5-7 days... with no help from us. We've seen this erratic sorting in organics, Product Search and AdWords over the past 2 years. All you can do is keep on keeping on.
Now we're focused on Bing like a mongoose on a bag of cobras. If Bing can continue increasing its 25% search share (since the Yahoo deal) to 40%+ by intensifying its Facebook relationship, all of us will be scrambling for MS SEO manuals. When Zuckerberg wakes up and fully integrates Bing results (beyond the current measly three below the Facebook page results) into its search tool, we could see a search windfall Bing's way as 500,000,000 Facebookers flood into Mr. Gates' backyard.
Rand, unfortunately those abusive, spammy tactics do work if your strategy is short-term. I've had clients in certain "$$$ industries" asking to rank their site for a keyword for just a few days to make tons of $$$$$... Of course it works with very aggressive tactics. But if you're going for a steady, long-term strategy, avoid those practices - you'll always get caught.
Very cool post. After following some competitor links, I can say firsthand that there are some really interesting decisions being made by some companies' SEOs. I would also like to congratulate you on that last line: "...there's an art to under-"optimizing" in order to achieve true "optimization."" Well said indeed.
It's true that overdone SEO doesn't look "natural"; it looks spammy. And the "spam" label is hard for a client to shake off. Thanks for this warning about overdoing it :)
I can't wait to show an old SEO partner that diagram! It's exactly how things used to "be" (or so thought everyone he was working with at the time).
It's funny; there's so much innovation in this industry, and we're all such action-oriented forward thinkers, that we all feel like we're never reaching the point we want to reach. But this post (and that diagram of how so many used to see SEO) makes it clear that we've come a long way... and that feels awesome.
"Those sites and pages that follow every single optimization tactic, from internal links to massive keyword focus to "perfect" anchor text in their off-site link building are going to stand out like sore thumbs to the engines."
Rand, what are the guidelines for not over-perfecting the anchor text? From everything I read, including here, anchor text is hugely important, although you should vary the text.
Thanks (._.)
SEO may still be in its infancy, but it is becoming a more viable way of advertising websites and pushing products. There will always be those who insist on shortcuts rather than adhering to best practices, but I've found that working in a structured way to exact standards really pays off in the long run and can bring great results.
I really like the underlying thought of this post - that SEO is about making good, in-demand content findable for users. On the sites I work for, we have spent the last three months making everything less optimized (both links and content), and both rankings and conversion rates are improving significantly.
On my work station I have a great sticker from Jane And Robot saying: Design for people, be smart about robots = lasting success. I think this slogan is more precise than ever.
This is getting into the warm and fuzzy area of SEO (as so much of it is), but so much these days is related to usability, directly and indirectly.
Bravo.
All good points, Rand. I do not think it is wise to overdo SEO at all. I have seen it with my own sites: the ones where I go too crazy with SEO underperform, while sites that are not as polished but have all the basics in place seem to rank more effectively.
This is what I'm taking from this post and subsequent comments:
Don't over-optimise on-page. You can see whether you've done this by working on a site, leaving it for a week and going back to it; you either have or you haven't. If you have, ask whether some of that work can be spread across a larger number of pages. If not, lose it, and see how that affects things.
Don't be too keen on one or two SEO Tools. Use a broader range.
Much of the advice Google gives is 'fair' advice - guidelines, not rules. Don't get carried away.
My take from this is:
1. Google's algorithm still has a long way to go.
2. SEOs should write content for users that contains natural variations of keywords, which helps to drive convertible long-tail traffic.
3. SEOs and site owners who over-optimise, as in the case of the hotels site, will ultimately fail.
4. Find a good mix when doing on-site SEO and don't overdo it.
5. Content is king. Ouch! Sorry, just had to say that.
For me, long-tail traffic is still the winner, which means you don't need to take the spammy approach either on-site or in link building and anchor-text distribution.
Very interesting read and completely agree with this. SEO certainly can be overdone, as you rightly point out, but a bit of careful reading up through blogs like this one can hopefully point people in the right direction in terms of what to do and what not to do. As you say, less is often more!
Really good article and some great comments.
I just need to say something to the folks complaining. How can anyone blame the SEOmoz tools for contradicting Rand's advice and insights in this article? If you are using an SEO tool built on an algorithm to completely define your SEO strategy, then you need to take a step back from the tools and start using your brain. Use your own common sense to make decisions about keyword density, and if you don't have the time, dump a few clients, because you are very possibly contributing to the cesspool.
Sorry if that was harsh.
I wish every current and future SEO client would read this thread. It really captures the challenge and difficulty of SEO and online marketing in general. Balancing good SEO tactics, conversion rate optimization, targeted marketing messaging, and usability in a manner that optimizes profits is no easy task.
Rand, when are you launching that comprehensive, all-in-one online marketing success and profit optimization tool?
Another great post from you; thanks for sharing it. Honestly, it's sad that there are people who overdo SEO - I don't understand them. A lot can happen if you do: your site might be banned from the search engines, and if that happens, your effort is wasted, and your site along with it. To those just starting out: try to read this article fully, because what he has shared here is all true. Make sure also to maintain the quality of your content, because that is still the key to bringing tons of traffic to your pages.
In Finland there are many "fully-optimized" sites and it seems that they are still doing fine in SERPs.
I'd like a button in the search results to report over-optimized sites to Google :D because it's so annoying to see, for example, every possible keyword crammed into a title tag.
I have read the article you posted here. Nowadays, SEO keeps trending higher and higher. Everything you have shared about the overdose of SEO is truly correct. I really appreciate your effort.
I have been wondering if I am subject to over-SEO-ing... My personal page, located on my SEOmoz profile page, may be in danger of this practice.
If you're talking about this page - https://www.easysafetyschool.com/ - I'd disagree. I think it's well optimized for visitors and engines and I don't see any signs of going overboard (granted, I took only a cursory glance).
That's very good to hear. Thank you Rand.
Under-optimized is the new optimized!
Under-optimized is just under-optimized (that phrase was ironic).
Well-optimized means balancing between over and under. Too much would be good only for bots (until it gets banned, perhaps); too little would not be seen by bots as relevant for the keyword you want to rank for. Equilibrium is the key.
So is an "overoptimised" page a case when users don't like it and bounce, don't convert etc. ?
I guess it's like a really good promo that gets people to call up and when they do the call centre staff are really rude/incompetent
As with everything in life, I wouldn't put it quite so simply... things have patterns. But what is sure is that over-optimized websites are unlikely to receive all the conversions that their rankings in the SERPs would suggest.
I mean, people like clear information... and an over-optimized page cannot be called a clear, informative website.
So what you're saying is that there is no such thing as overdoing SEO per se, but it can reach a point where the page looks like spam and is therefore no longer effective? That makes sense, as you can only do on-page optimisation in moderation without seeming spammy.