A few weeks back, Stephan Spencer (one of my Art of SEO coauthors) authored a post for SearchEngineLand entitled 36 Myths that Won't Die But Need To. I certainly recommend checking out the post, but be warned of some highly contentious comments. The tweets and offline feedback were similarly up-in-arms and it's easy to understand why.
SEO is a field where reputation is a huge part of your ability to perform well. Because the search engines don't publish comprehensive guidelines (or even guidelines that cover 1/10th of the material necessary for good SEO work), businesses rely on the savvy of individual consultants, contractors and employees. If your boss reads Stephan's article and sees him contradicting advice that you've been giving for years, faith erodes and with it, job security. Luckily (or perhaps unluckily), there are probably 5-10 articles you can find on the web that support your side of the story, many from quality, trusted sources.
The lack of standards sucks. But, it's also the reason our industry is so exciting. New experiments & experiences can reveal critical data about search engine operations. The ability to become an expert is open to anyone with the skills and perseverance to see it through. But, no matter how hard you try, it's hard to overcome some of the persistent myths of the SEO field - I've been caught in plenty of them myself (and who knows, maybe still am today).
This post is going to look at some of those nagging, lingering falsehoods that continue to thwart good SEO efforts, specifically those that Stephan called out and that faced strong resistance. As always, this is my opinion, based on my experience (see the moz disclaimer), except in cases where research and data exist, in which case it's my opinion that the research cited is good enough to warrant that opinion :-)
How Significantly Does Personalization Affect Rankings?
Stephan Says:
Although it is true that Google personalizes search results based on the user’s search history (and now you don’t even have to be logged in to Google for this personalization to take place), the differences between personalized results and non-personalized results are relatively minor. Check for yourself. Get in the habit of re-running your queries — the second time adding &pws=0 to the end of Google SERP URL — and observing how much (or how little) everything shifts around.
Comments Include:
I’m not sure I agree with your statement under #5 that personalization changes are “relatively minor”. I’ve been seeing some drastic rank changes due to personalization. I just posted about it at https://www.rypmarketing.com/blog/49-are-google-serp-personalizations-relatively-minor.whtml While there are still “absolute rankings” that display most of the time, your site can be ranked much higher or lower, based on personalization.
My Opinion - They're both right. Personalization seems to primarily affect areas in which we devote tons of time, energy and repeated queries. This means for many/most "discovery" and early funnel searches, we're going to get very standardized search results. It's true that it can influence some searches significantly, but it's also true that, at least in my experience, 90%+ of queries I perform are unaffected (and that goes for what I hear/see from other SEOs, too). The linked-to post above actually helps to validate this, showing that while ranking changes can be dramatic, they only happen when there's substantive query volume from a user around a specific topic.
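If you want to make a habit of Stephan's &pws=0 check, a tiny helper for building the two URLs saves some typing. This is just a sketch (the function name and example query are mine, not any official tooling), and since Google frowns on automated querying, open the URLs in a browser rather than scraping them:

```python
from urllib.parse import urlencode

def serp_url(query, personalized=True):
    """Build a Google SERP URL; append pws=0 to request non-personalized results."""
    params = {"q": query}
    if not personalized:
        params["pws"] = "0"  # the &pws=0 flag Stephan mentions
    return "https://www.google.com/search?" + urlencode(params)

# Open both in your browser and note how much (or how little) shifts around.
print(serp_url("seo consulting"))
print(serp_url("seo consulting", personalized=False))
```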
Do We Need to Update Our Homepages Every Day to Maintain Rankings?
Stephan Says:
"It’s important for your rankings that you update your home page frequently (e.g. daily.)" This is another fallacy spread by the same aforementioned fellow panelist. Plenty of stale home pages rank just fine, thank you very much.
Comments Include:
It actually is important. Sure, a stale home page might rank, but Google definitely takes freshness into account in rankings. I’ve seen rankings boosts whenever I post new content.
This varies from niche to niche, of course a site can rank well whilst remaining static, it may also have a considerable number of links pointing to it. In a competitive niche where the link volume/quality is pretty even, then regular updates to the home page, and other pages within the site can make all the difference – to describe this as a fallacy is a fallacy itself.
My Opinion - There was a time when I was pretty convinced this was true. I did lots of testing around it for my clients' sites and would put in time each day making sure new content appeared on their homepages. Today, I'm much less of a believer. Stephan is certainly correct that plenty (if not the overwhelming majority) of homepages and, indeed, web pages that rank well for many queries are static. I do think it's a great idea to continually have new content linked-to from homepages - by linking to the latest blog posts, YOUmoz posts and marketplace postings, the SEOmoz homepage helps drive spiders to revisit frequently and crawl these new posts (though RSS pings may make that obsolete).
Overall, I wouldn't advise updating pages just for the sake of possibly getting a "fresh content" boost. QDF operates on unique, fresh, individual pages (or older pages that are earning newly fresh links). I'd have serious doubts as to whether anything in Google's ranking system rewards pages that simply change frequently - it doesn't pass my smell test.
How is Google Treating "Reciprocal" Links?
Stephan Says:
Trading links helps boost PageRank and rankings. Particularly if done on a massive scale with totally irrelevant sites, right? Umm, no. Reciprocal links are of dubious value: they are easy for an algorithm to catch and to discount. Having your own version of the Yahoo directory on your site isn’t helping your users, nor is it helping your SEO.
Comments Include:
Google places less weight on reciprocal links than they used to, but they still count. I've done numerous link exchange campaigns for websites, and seen huge boosts in rankings. At the end of the day, would you rather have a reciprocal link from another site in your niche, or no link at all? The answer is obvious.
Reciprocal links aren’t necessarily of dubious value. Consider this example:
I’m a news site. I link to CNN because it’s CNN and they have news. One day, CNN links to me (huzzah). Technically, this is a reciprocal link, but no way in hell is Google going to discount the value of the link because the sites are linking to each other. So now you have to determine intent — and how do you do that?
In many niches, every authority site links to every other. Not only is it natural, but these are the most relevant possible links. So what you seem to be saying is that Google lowers the value of a site’s most relevant links — thereby increasing the relative value of irrelevant or off-topic ones. That makes sense how?
My Opinion - This one really depends on how we're defining "reciprocal links."
The post you're reading links to Stephan's SELand article. Would Stephan updating that post with a link here potentially hurt both our rankings? No.
However, if SEOmoz built a link directory on our site (ironically humorous because, as long time readers may recall, we used to have one) and promoted linking to your site if you reciprocated with a link back here, I'd be more concerned. This is essentially link graph manipulation and while it's a fine line to tread, plenty of folks have crossed it in the past and, as Stephan notes, unnatural reciprocal link behavior is remarkably easy to spot on a link graph.
I wouldn't be concerned at all with a technically "reciprocated" link, but I would watch out for schemes and directories that leverage this logic to earn their own links and promise value back to your site in exchange. Also - watch out for those who've evolved to build "three-way" or "four-way" reciprocal directories such that you link to them and they'll link to you from a separate site - it's still attempted manipulation and there are so many relevant directories out there; why bother!?
Keyword Density is Not Used - How Many Times Do We Have to Say It?
Stephan Says:
Keyword density is da bomb. Ok, no one says “da bomb” anymore, but you get the drift. Monitoring keyword density values is pure folly.
Comments Include:
Folly? Hardly. If you’re trying to rank for a keyword, you want to make sure you use it a few times on a page. That’s just common sense. Of course, you don’t want to overuse a keyword, or it might come across as spammy. Any smart SEO pays attention to KW density.
My Opinion - Again, we're likely coming down to semantics. The formula for keyword density - the percentage of a page's total words made up by the target phrase - is indeed folly. IR scientists discredited this methodology for relevance decades ago. Early search engines and information retrieval systems already leveraged TF*IDF as a far more accurate and valuable methodology.
In my opinion, the reason the myth persists is that sometimes, optimizing towards a keyword density can actually improve your relevance and targeting of TF*IDF. I'll make an analogy - let's say you believe flight is accomplished not by lift, thrust, drag and weight, but rather by reaching a particular velocity in a bird-shaped device. It's entirely possible that you might stumble upon flight, or flight-like elements even without understanding the physics. That said, could you honestly call yourself an aeronautics engineer?
If we're going to call ourselves professional SEOs, we should bother to learn the science. Yes, adding more instances of a keyword term or phrase to a page might indeed help your rankings (usually not massively and almost never in highly competitive spaces), but that does not mean that the keyword density average you've been using is accurate or that engines leverage the metric. Spreading this ignorance of math and science does little to further the SEO field's reputation - let's end it.
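For anyone wondering what TF*IDF actually computes, here's a toy sketch of both metrics over a tiny, made-up collection (textbook formulas only - no engine scores pages this simply, and the example pages are hypothetical):

```python
import math

def keyword_density(page_text, term):
    """The 'folly' metric: the target term's share of all words on the page."""
    words = page_text.lower().split()
    return words.count(term) / len(words)

def tf_idf(page_text, term, collection):
    """Textbook TF*IDF: raw term frequency, discounted as the term appears in more documents."""
    tf = page_text.lower().split().count(term)
    df = sum(1 for doc in collection if term in doc.lower().split())
    idf = math.log(len(collection) / max(df, 1))
    return tf * idf

pages = [
    "fiji travel specials and fiji resort deals",
    "tahiti travel guide and honeymoon travel tips",
    "cheap flights and travel insurance advice",
]
for term in ("travel", "fiji"):
    print(term, [(round(keyword_density(p, term), 2), round(tf_idf(p, term, pages), 2)) for p in pages])
```

Note how "travel" scores a healthy density on every page yet carries zero TF*IDF weight because it appears everywhere, while the rarer "fiji" is what actually distinguishes the first page. That's the gap between the folk metric and what IR systems have used for decades.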
Do Hyphens in Domain Names Really Suck for SEO?
Stephan Says:
Hyphenated domain names are best for SEO. As in: san-diego-real-estate-for-fun-and-profit.com. Separate keywords with hyphens in the rest of the URL after the .com, but not in the domain itself.
Comments Include:
Hyphens in domain names are less than ideal for flagship businesses because they’re hard to communicate, but you better believe Google ranks domains with keywords in them highly, even if they contain hyphens. Again, it’s less than ideal (a hyphen-less .org or .net is preferable to a hyphenated .com), but if the top choices aren’t available, a domain that includes a hyphen can be a decent substitute.
Don't make a blanket statement that having hyphens in your domain hurts your potential. This is just fallacy. Yes, hyphens suck for direct traffic, as the domain is more likely to be spelled incorrectly. But when it comes to search, domains with hyphens in them do just fine.
My Opinion - They suck. Yes, I realize that technically, they may not have a formal algorithmic component (though I'm guessing part of Google's spam filter early warning system does look at hyphens, particularly when there's more than one in a domain name). But, they certainly correlate with worse branding value, which means fewer links and citations, less reputation in the eyes of visitors and potential business partners, less viral spread through word-of-mouth and, as the comments note, lower type-in traffic.
All of those are going to have a 2nd-order impact on rankings through metrics like inbound links, social mentions and usage data (to whatever degree you believe that may be a signal). Thus, hyphens in domain names do, indeed, suck for SEO (and lots of other stuff). I've never liked SEO practices that operated in a vacuum or didn't consider usability, virality, positioning, branding or other basic marketing techniques. Going back to the analogy above, it's like the aeronautics engineer who doesn't consider seats a necessity. Sure, it flies, but who exactly will pay for a ride?
Does Click-Through Rate Matter?
Stephan Says:
The clickthrough rate on the SERPs matters. If this were true then those same third-world link builders would also be clicking away on search results all day long.
Comments Include:
Don’t assume that clickthrough rates don’t matter just because of some potential abuse that would happen if absolutely zero logic were built in.
In regards to CTR influencing rankings, there are a number of things that lead me to suspect that user behavior does affect search results.
I'm sure you are familiar with the so-called Google "honeymoon period" that seems to occur when a new site launches. The site will rank highly for a few weeks, and then see a dramatic drop in SERPs. I've launched over a dozen sites in the past year, and have noticed this pattern.
I believe this goes beyond QDF, it’s a site-wide phenomenon. The hypothesis is that Google will temporarily rank a new site highly, to see how users perceive the site. If people visit the site, and then immediately hit the back button to return to the SERPs, that’s a good signal that the site did not meet the needs of the user, and that google should not rank it as highly.
I am on the fence, I could literally flip a coin whether it is myth, magic, or the CTR really does make a difference. If it does it is such a small difference it’s nothing I would ever focus on for success.
My Opinion - I've written and spoken about this extensively in the past and it doesn't need a great deal of re-hashing. I will, however, say that should any SEO ever discover that it substantively impacts rankings, we're going to be faced with an army of zombie botnets trying to take over our computers not to send email spam, but to click on links through our "reputable" Google accounts. Just look at the hacks of Facebook, Twitter & Wordpress over the past few weeks and ask yourself - if any spammer could show a financial return from clicks influencing Google's rankings, would the organic results really be as free of click fraud as they are today?
We do have one data point from Google that suggests they look at some kinds of less easily manipulated click data. A Googler speaking at the first SMX East show in New York mentioned during his session that Google will record searches that are performed frequently with no clicks, followed by query refinement or abandonment, as potential searches that need work (because it seems no one likes the results). If this is what you mean when referring to click data being used in the engines, I think that's completely reasonable.
Do H1 Tags Help with Rankings?
Stephan Says:
H1 tags are a crucial element for SEO. Research by SEOmoz shows little correlation between the presence of H1 tags and rankings. Still, you should write good H1 headings, but do it primarily for usability and accessibility, not so much for SEO.
Comments Include:
H1 tags are very important, I’ve seen pages rank well for targeted keywords once the tag has been tweaked to be more targeted, not spammy or purely for SEO, but well written. Ok, in some cases it may not be “crucial” but after the title tag I think it’s up there as one of the most important on site factors.
My Opinion - Covario's research is spot on; I got to listen to and speak with their chief scientist, Dr. Matthias Blume, at a conference in Silicon Valley. It also matches up to our correlation and rankings model data. You're invited to repeat on-page keyword prominence testing and check the results for yourself (more on search engine testing methodologies here). H1 tags are very slightly better than Bold/Strong tags for keyword usage and both are barely better than simply using the keyword on the page (in any text format).
In every instance I've seen a report of H1s improving rankings, it's been because the keyword phrase was now included as some of the first text on the page and provided an additional instance of the target term and title element in the on-page copy. As Stephan recommends in the comments, try taking a site with H1s and replacing them with CSS styles that mimic the text formatting. You may see tiny fluctuations in a few close rankings, but likely little else.
All that said, H1s are still a best practice. If you're building a site from scratch today, you should certainly use them for headlines, and they do provide some (albeit quite tiny) benefits for SEO. However, I feel incredibly guilty about the many times in my SEO consulting career I pushed hard for engineering and development teams to get H1s right in the markup when it generated such tiny results. That time would have been far better spent on dozens of other projects. If I can, I'd love to save you that same embarrassment and disappointment. H1s may fit with SEO stereotypes, but that doesn't make them a high-priority, high-value activity. If you don't believe the research of others, do your own, then listen to the results.
Can Linking to Other Sites Help You Perform Better?
Stephan Says:
Linking out (such as to Google.com) helps rankings. Not true. Unless perhaps you’re hoarding all your PageRank by not linking out at all — in which case, that just looks unnatural. It’s the other way around, i.e. getting links to your site — that’s what makes the difference.
Comments Include:
Not true. Matt Cutts has said that linking out to high quality websites is one of the many factors that they use to evaluate a site. NOTE: the comment references the text copied below from this post by Matt Cutts (on Google's webspam team):
Q: Okay, but doesn’t this encourage me to link out less? Should I turn off comments on my blog?
A: I wouldn’t recommend closing comments in an attempt to “hoard” your PageRank. In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.
My Opinion - I suspect there may be some small, positive effects of linking out to relevant, quality sites and pages for SEO. However, Stephan's likely correct in his assertion that just linking to a "high Domain Authority" or "high PageRank" site won't normally help. He's also right to say that hoarding link juice is likely a very bad move. You can listen to the NYTimes' SEO, Marshall Simmonds, talk about how adding external links to articles on the site had a noticeable positive impact on the Times' rankings and traffic.
I don't have correlation or ranking model data on this, nor have we experimented internally to the degree that I'd feel comfortable calling this a settled debate. My instincts say Google probably considers outbound links in some form or fashion, but I doubt it's a huge ranking factor. It might be more important than H1s, though :-)
PageRank is a Good Predictor of Rankings?
Stephan Says:
Your PageRank score, as reported by Google’s toolbar server, is highly correlated to your Google rankings. If only this were true, our jobs as SEOs would be so much easier! It doesn’t take many searches with SEO for Firefox running to see that low-PageRank URLs outrank high-PR ones all the time. It would be naive to assume that the PageRank reported by the Toolbar Server is the same as what Google uses internally for their ranking algorithm.
Comments Include:
Come on now. It’s true that a lot of people place too much emphasis on PR, but let’s not take it to the opposite extreme and say it’s irrelevant. PR is not the be-all-end-all of rankings, but it still matters. Having a high PR homepage clearly means *something*.
I probably couldn't disagree with anything more than this one. I guarantee a website that has a homepage PageRank of 6, with pages 2 deep having PageRank 5 and trailing off into 4's and 3's, gets WAY more traffic than one with PageRank 3 that trails off into 2's and 1's. PageRank is not 100% accurate, but it's an extremely good indicator; it's not just make-believe or useless nonsense that authoritative sites have PageRank 6, 7, 8, 9, 10.
My Opinion - They're both right (though the "guarantee of traffic on the PR6 vs. 5 site" sounds like a bet this commenter's opponent could win many, many times over). Our data on PageRank correlation is very solid and suggests that yes, PR is positively correlated with rankings on Google.com (though much less so in Google.co.uk - sorry Brits!). However, the degree of correlation is not overwhelming and there are far better single metrics if rankings correlation is your goal.
I would strongly get behind Stephan's statement that what the toolbar server reports is not what Google uses internally. They've messaged this many times. It's also very true that PageRank is only one of a plethora of ranking signals, and plenty of PageRank 3 pages outrank PageRank 6 or 7 pages for given queries.
Does Great Content Equal Great Rankings?
Stephan Says:
Great Content = Great Rankings. Just like great policies equals successful politicians, right?
Comments Include:
I see no one is criticizing "Great content = great rankings." This is job number one.
My Opinion - I think the commenter may have missed Stephan's intended sarcasm. I am in full agreement that great content ≠ great rankings. This is no more true than the statement: "the way to win elections is to propose the best legislative ideas."
Marketing, promotion, networking, partnerships, virality, incentives and hundreds of others feed into the inputs for a site's success on the web. Unless you believe that links are meaningless and Google's content analysis systems can read and rank content like a human (e.g. Google thinks the Times' article on Brown's stepping down was more adroitly perceptive than the Post's), the ability to draw in links, which is not and likely never will be about the "best content," will have an overwhelming impact on rankings.
The future likely holds greater usage of data from social media and social web interaction, but even this depends on far more than the content's quality. Those brands and sites that have early-adopting, viral-sharing, people-connecting, idea-distributing users invested in promoting their work are likely to be long term winners with little regard for comparative levels of content quality.
There's lots more fun and interesting discussion on the SearchEngineLand post, but hopefully these will spark some interesting chats in the comments here as well.
Great recap and additions.
One of the biggest issues with SEO statements is people want to take them as extremes. They want a simple, pat answer that covers any instance. It just isn't that simple. This is why DIY or simply following a checklist can be so hit or miss, without some kind of practical knowledge or experience to back it up.
I heard a little tale that has stuck with me and is the perfect context for SEO (and most things in life):
And SEO also reminds me of something a high school teacher once told us: "The more you learn, the less you know."
Every new bit of knowledge and experience in SEO reveals so much more. The best SEOs still consider themselves students of SEO, whereas the hacks believe they have it mastered and know it all already.
Or as old Socrates once said:
I know that I don't know
I love that Brian - think I might borrow that analogy.
Feel free.... I did ;)
That's an awesome ham story.
hi identity, i agree with that TOTALLY..... there were days i wanted to quit this whole SEO thing coz everyone else seemed to know better than me..... this happens only when i get a little confidence about SEO.... its always up and down... "the more you learn, the less you know" no better words to explain this...
Love the ham baking tale. My wife has told me many times that cutting the ends off of cucumbers and then rubbing the cut end on the cut cucumber will make sure the cucumber isn't bitter. Ok, don't know how this one relates to SEO but it seems to work!
Great recap of a great post and a good summary of the SEOmoz opinions about some of the most discussed myths.
Of all the myths listed in Stephan's post, there are three that, incredible but true!, are very, very hard to fight, especially with clients who aren't fond of SEO and are easily conned by so-called SEO experts.
I'm talking about Myth 1 (Our SEO firm is endorsed/approved by Google.), Myth 6 (Meta tags will boost your rankings) and Myth 30 (Using a service that promises to register your site with “hundreds of search engines” is good for your site’s rankings).
At least, they are stereotypes I have to fight against a good 70% of the time I'm contacted by a client.
Finally, it's almost ironic that when it comes to SEO, which has a lot to do with (key)words, many discussions originate from semantic interpretations of the same definitions :).
"Our SEO firm is endorsed/approved by Google"
This is not a myth but a deceptive (or, better said, fraudulent) way to entice clients. What Google has to say about this:
Beware of SEOs that claim to guarantee rankings, allege a "special relationship" with Google, or advertise a "priority submit" to Google. There is no priority submit for Google.
Source: https://www.google.com/support/webmasters/bin/answer.py?answer=35291
Using a service that promises to register your site with “hundreds of search engines”
“Avoid SEOs that talk about the power of "free-for-all" links, link popularity schemes, or submitting your site to thousands of search engines. These are typically useless exercises that don't affect your ranking in the results of the major search engines -- at least, not in a way you would likely consider to be positive.” Source: https://www.google.com/support/webmasters/bin/answer.py?answer=35291
Now you don't need to fight any more. Whenever someone has a doubt refer him/her straight to Google :)
Thanks SHimanshu for the reminders, even if I know the answers well too. By "fight" I meant the very fact of having to show those answers to clients and explain them again and again... in a sort of mantra.
And about the "Google endorsement" it's somehow really a myth that can make decide a client to choose for the con and not for the honest one who answer: "No, we have no official endorsment by Google, as Google doesn't have special relationship with any agency"... as they tend to think with an 'endorsement state of mind' (and that's why - as a pure marketing tool - can be useful to show any Google certification you can have, as Adwords or GAnalytics, which are not endorsements but quality statements yes).
oh come on they have a badge and everything, it has to be real!
Hi, Rand. Did you mean to say "I would strongly get behind Stephan's statement that what the toolbar server reports is what Google uses internally"? Is there a "not" missing? Stephan had said: "It would be naive to assume that the PageRank reported by the Toolbar Server is the same as what Google uses internally for their ranking algorithm."
Oop - good catch; all fixed.
A lot of my comments are quoted there... a little credit would be nice.
Sorry about that - I didn't attribute the comments this round and recognize it wasn't a particularly awesome move :( I just ran out of time.
"I am in full agreement that great content ≠ great rankings"
I agree with your statement in principle, but surely the best way to promote your site/increase virality & links is to have great content?
Only once you have great content (/policies) will you get good rankings (/win elections).
I think the point that needs to be made is that not one single aspect of SEO will get you top rankings, but they all follow hand in hand in some way or other.
Dan
Hey Dan - I certainly don't mean to suggest that one should not produce content that is of high quality and valuable to searchers. I'm simply saying that producing that content will not, by itself, get you rankings. Branding, marketing, perception, technical optimization and hundreds of other inputs go into determining SEO success, and I worry about the naive "best content rises to the top" philosophy.
SEO is full of many such "necessary but not sufficient" conditions.
I think that click through rates do matter, but in a far more complex manner than would be easily gamed...
Cutts said at PubCon's state of search speech last November that "every pixel must be justified by click through" (paraphrasing). Of course, he was referring to local search and vertical creep at the time, but I think that it would be ludicrous for Google not to pay attention to such a powerful metric.
First, I think that Google is looking at CTR in terms of making sure the whole front page SERP is valuable. A page may receive less CTR than normally expected at position 4, for example, but will retain its rankings because the people clicking on #4 are individuals who are not clicking on anything else on the page (ie: a different search demographic).
The page that is likely to have difficulty is one that only gets 2nd-hand-clicks (ie: someone clicks on page 1, doesn't like it, then clicks on this page in question) and then, those people proceed to click back and choose another.
I also think that the argument that SEOs would easily be able to game the click metric is unlikely too. The same metrics Google uses to handle click fraud in Google Adwords can be used to devalue and/or ignore generated clicks on the natural SERPs as well. Writing a bot to handle these types of clicks is tricky because the tracking is generated by Javascript. Moreover, the ubiquitous nature of Google Analytics would allow Google to devalue any clicks that land you on a page with Analytics where the tracking didn't fire (in bot making, we call this "failing on the follow through").
I am not saying that I think Google weighs CTR with any greater emphasis than the many, many known metrics which influence rankings, but I do believe that it is an essential metric in creating an effective search engine.
What users do is by far a better signal than what webmasters do, IMHO. The catch is that Google has to use non-user signals to construct the rankings page in the first place, and then can use these user-signals to refine them.
Fun stuff! So much contention. Just like politics.
I'm with Stephan on 100% of this.
The people that perpetuate most of these myths are the same who sell unsuspecting folks on SEO scams like these.
This is a great read. I do think there is a bit of semantics arguing going on, but nonetheless these points are all valid and important points to consider.
I'm glad we have such great people in the industry to share their experiences, mistakes, and successes for others to learn from!
I'm impressed that you managed to not only get a "not equals" sign in your post, but got it in anchor text! That post now probably ranks #1 for searches for that character :-)
On to a REAL comment, now: a few years ago I ran a test on two nearly identical websites I had, one offering Fiji travel, the other Tahiti travel. Both had a couple hundred pages, both were ranking about the same (and quite well actually, they weren't weaklings) for "fiji specials"/"tahiti specials" etc. I tried a little experiment where on the Fiji site, I added a sidebar of about 30 links to strong Fiji sites, like the visitors' bureau, a Wikipedia page, etc. I didn't actually want to lose visitors to those great sites, so I cloaked that sidebar full of links and only showed them to the spiders (I know, BAD DOG....I'm much better behaved these days).
The Fiji site showed a pretty dramatic improvement in rankings.
I'd be VERY interested in hearing from anyone who'd tried this kind of side-by-side test today. I no longer have those sites, or I'd do the test.
Here's a thought. Maybe Google does use clickthrough rate but uses all sorts of data to determine and make sure it is an actual user, the same technology they use for Adwords to weed out fraud. Such as when the search is made, where does the user click? Do they click the first link like a real user would naturally do? How long do they stay on that site and then come back and click on the next link and so on to get to yours. So if it was click fraud, Google could see it very easily as the user would search, immediately click on their link and then be gone, not what a real user would do. Plus since so many sites use G Analytics, they have all that data to use as well, the amount of time on clicked site, etc.
For what it's worth, the impact of CTR was tested in SEOchat by GazzaHK and a number of comrades a couple of years ago (well worth reading through, IMO) and was shown to move SERPs. I have friends who tested this independently and also found it to move SERPs. A new experiment along similar lines is slowly brewing here, if anyone wants to volunteer and help us click.
I'd just add that all of that sophistication and more has been added to help stop viruses, malware, email spam, etc. but all of these still get past the filters.
Even if they could rule out machine-controlled clicks, there's still Mechanical Turk and the like to contend with (where you can hire actual humans for pennies to perform small tasks).
I'm not saying it's impossible, nor that it has no impact, but I think it's extremely unlikely they'd give it a substantive, consistently measurable impact. The results are just too dangerous.
(but who knows, I've been dead wrong about things like this before -I used to recommend buying links, used to not recommend XML sitemaps, used to focus on H1s - clearly I'm not perfect).
Who doesn't make wrong evaluations, Rand? The good thing is being able to recognize them and not trying to justify them.
Thank God the SEO community - even with all the possible clashes that can exist between strong personalities - is one of the most open I've ever found in the professional world, and it's the community, with its openness about best practices, that makes it possible to understand the mistakes of the past and the opportunities of the present/future.
Having experienced another kind of business environment before (A/V Media), I can say this because I've known the opposite.
That's why I love this community and started participating, blogging and attending events. The Internet is a rough place to share thoughts, opinions and facts, but the SEO world seems to have pockets of tremendously talented yet still somehow humble and open-minded people. I feel lucky to be a part of it every day.
While the SEO community as a whole is a really nice one to be involved with, the sub-community here in mozland is without par.
I don't say that because it always has the best information. I say it because it's an open, honest environment where civility reigns, helpfulness abounds, and the only time anyone ever gets pounced upon is when they are trying to put spammy links in.
As everyone knows, to reach the most customers possible, every business should try to optimize their listings to ensure the highest CTR regardless of whether there is any SEO impact. If there is any impact at all, all the better. I guess what I am saying is people should focus on getting the highest CTR to get more customers, not because it may help them in SEO.
Just like you do not want your title too long and have Google cut it off with "..." - I read a report that said it decreases your CTR for some reason. I emailed Zappos.com's CEO about it months ago as they were guilty of it and it looks like they finally took my advice.
Great opinions Rand, but I'll be one of those guys fighting the CTR battle. Mechanical Turk is really a non-factor IMO. Google could easily identify these users by IP addresses. I think 99% of Mechanical Turkers are NOT US based for example. I'm not suggesting that it's worth while to game something like this, but that doesn't mean it's not a factor. As others have mentioned it's just good practice to focus on page titles, meta descriptions and URLs to increase CTRs regardless.
The link graph is gamed every day by most savvy SEOs, but Google just works to improve the way they filter links, etc. I would imagine they would do the same with user stats such as CTR. If these stats are significant to Google, one of the few ways to truly game and test it would be to manage a true distributed workforce that represents the target market's patterns, behavior and geographic location. The last place this topic should be mentioned is in a Myths post.
Wow, I'm flattered that you referenced my comment and linked to my blog post (though I see it's nofollow, lol).
I really enjoyed Stephan's original post and also enjoyed reading your views on the issues. I think Stephan and I agree on the personalization issue, we just were using the words a tad differently.
All the PR and meta keyword related myths are really, really old. I wish they would die already. :-P
I think that's just a result of the copy/paste from SELand's comments - I should go take out the nofollow... Thanks for following up, Adam!
Thanks for another well thought out and detailed post.
I remember a great article written by you (Rand) for SEOmoz entitled The "perfectly" optimized page.
In this article you have stated the following:
"Word Separators – Hyphens(-) are still the king of keyword separators in URLs, and despite promises that underscores will be given equal credit, the inconsistency with other methods make the hyphen a clear choice."
To be clear I understand that you are talking about hyphens in domains in this piece rather than keyword separators in urls.
However, which of the two domains below do you think is best for SEO - not for branding or link acquisition or even direct entry, but purely from a search engine ranking standpoint? Do you think search engines provide more/less weighting to the word bakery in the hyphenated domain?
Example: www.janes-bakery.com or www.janesbakery.com
Before reading this post I would have automatically opted for the hyphenated domain as it separates a target keyword (bakery), but this may be an SEO myth held by me that needs to be busted!?
I would be interested to hear your views?
I always just think of how easy it would be for someone to pass on the site over the phone or word of mouth.
Most people I think would end up trying to go to JanesBakery.com and instead of finding wonderfully buttery croissants and treats they would get a blank index not found page.
I think the value of being easily remembered would outweigh the smaller seo value. I also would be interested in others views on the value of hyphenated domain URLs.
I never have posted a link before in the comments so just let me know if I should remove it =)
Well it's a yummy link, so I think it's okay this time! =)
Pete - my apologies! You are absolutely right and that's my bad for not considering how folks might interpret the advice in that post. I'll try to make an edit ASAP.
On edit: Just updated the perfectly optimized page post.
That's a huge point of confusion, and Google isn't much help. In the domain name, hyphens seem to be a negative, but they're the preferred separator in the rest of the URL (after the TLD). In fact, I've seen many cases where Google can't or won't separate words in folder and page names, if they're just combined, even though Google easily separates words in domain names. It doesn't really make a lot of sense.
@Dr Pete
Do you think it comes down to Google allocating more resources initially around domains, by crawling the WHOIS databases, and domains being more easily defined? They also usually only have to do this once, whereas content and folders may be dynamic, so Google has to relearn each time you change or add new content.
Also, they can see in the WHOIS data that the registrant is "the lost agency", so they can work out that the domain thelostagency.com is likely to be 3 words, "the lost agency", which match against the WHOIS data.
Another point: there are defined and standard formats for domains that are consistent, so they are easy for software to understand based on rules, whereas folders and page names may be different and use %20, -, or _
https:// (can ignore)
www (understands if a subdomain or not)
. (start of domain)
thelostagency
. (end of domain)
com (defines the tld)
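To make that split concrete, here's how Python's standard urllib.parse carves up a URL (the path below is hypothetical, just mixing the separators the comment mentions): the host portion always follows the rigid dot-delimited format, while everything after it is free-form.

```python
from urllib.parse import urlparse

parts = urlparse("https://www.thelostagency.com/seo-articles/keyword_research%20tips")

print(parts.scheme)  # 'https' - can be ignored
print(parts.netloc)  # 'www.thelostagency.com' - rigid: subdomain . domain . tld
print(parts.path)    # '/seo-articles/keyword_research%20tips' - free-form: -, _ and %20 all show up
```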
That's a completely fair point - there probably are real technological issues, as the domain is much more constrained than the rest of the URL. I just like to complain :)
yep just a point :)
If hyphens are better, couldn't you just get both domains and 301 the one without hyphens? That way you could give janesbakery.com verbally, but that domain would redirect to janes-bakery.com.
I mean, it'd be good to own both domains anyway.
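In practice a domain-level 301 usually lives in DNS or web server configuration rather than application code, but for illustration here's a minimal sketch of the behavior in Python/Flask (Flask is an assumption on my part; the domain names are just the example from this thread): every request arriving on the non-hyphenated alias gets permanently redirected to the same path on the hyphenated canonical.

```python
from flask import Flask, redirect

app = Flask(__name__)
CANONICAL = "https://www.janes-bakery.com"  # the hyphenated domain you want to consolidate on

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def to_canonical(path):
    # 301 = permanent redirect: the non-hyphenated alias consolidates onto the canonical domain.
    return redirect(f"{CANONICAL}/{path}", code=301)

if __name__ == "__main__":
    app.run()
```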
Great post, Rand. I think a lot of people just disregard certain SEO beliefs as myths when they're partially true, and vice versa too. Thanks for the clarification with this post.
Alright, to me most of the mentioned were no brainers... just common sense. Think about it people. Maybe I'm the exception here but using accurate ANYTHING is key to attracting a target audience. As per domains you don't want some long run-on domain... of course not, but in terms of key words... yup it counts... (duh moment!). Keyword density... if you want to rank for a key word OF COURSE you should have it in your content and in your H tags no matter if it's H1 or 2 or H6... just use it. What I say is cross all your T's and dot all your I's. Using poor sites for link building... duh... that sucks. It is better to get links from high pr relevant sites in small quantities rather than loading up on crappy PR 1 and 2 sites.
Great post. I have to agree with everything you explained. However, I'm still a little frustrated that the rules of good SEO constantly change (and are really unknown). It's like trying to play a sport where the rules constantly change and are never published!!
However, I understand why, and understand that is simply how this game is played. At the same time, it's good to know there are players out there working hard to help the rest of us. (And if you don't like the rules, don't play!) Thanks!
The flip side of the coin Chad is that it's job security for SEO's :)
Cracking post. I particularly like Rand's honesty. I think we all have slavishly believed in one SEO technique or other in the past! I think what separates good SEOs from the others is less slavish following and more independent thought, observation and testing. SEOmoz live and breathe this approach. Jeremy.
I can certainly see why that post generated so many responses. I especially like point number 24 - I have heard this so many times myself.
24. "SEO is a one-time activity you complete and are then done with. How many times have you heard someone say “We actually just finished SEOing our site”? It makes me want to scream “No!” with every fiber of my being. SEO is ongoing. Just like one’s website is never “finished,” neither is one’s SEO. "
When they say this, people are oft referring to the code behind the site. You can definitely get to a point where you've finished the on-site SEO at that point, but not necessarily to the point where they'll never have to change it again. What I mean is, they've finished implementing the strategy, finished making sure everything is usable, semantic and fast and then they just have to sit back and wait for results and monitor. That's why I said a site is "finished" with SEO.
Once again, take it with a pinch of salt!!!
Between the H1 tags, keyword usage and "content is king" benefits all being debunked, 90% of my white-hat, traditional approach I base my work around has been dismissed. I'm so depressed.
Don't be depressed, just advance, develop and expand on your current skill set to help your clients' and yourself.
It's no wonder you are one of the most authoritative individuals in SEO. Thanks for sharing your thoughts.
I completed some onpage tactics for a client not too long ago and have since seen the site traffic jump 150% after doing so. I have a hard time believing that none of these factors have any influence on a site's rankings. I mean, from the sounds of what is being said, any onpage is a waste of time. I am not saying it's the be-all-and-end-all but it certainly forms a good foundation. Am I wrong?
Steph, my takeaway is: Header Tags have been slightly devalued, Keyword Density is a red herring (which does not mean that Keywords themselves are, rather that a specific frequency of their use is bogus) and while good content is a good thing, it's no guarantee of top results.
But there is no way that On-Page is dead. If you have a relatively non-competitive space to compete in, you can totally rule with good on-page (combined with linking of course)
Good point, and I echo what others have said here:
Yes on-page is important, but some people think that if they do nothing but put a keyword in their H1, they'll be on page one.
I think that's what Rand is pointing out. If your site has no on-page SEO, adding title tags, H1s, and sprinkling keywords throughout the site will absolutely improve rankings (I've seen it dozens of times).
But an H1 does not page one rankings make.
Steph - on the contrary, I hope that's not the message I'm sending; I'm pretty sure it's not the one Stephan's sending either.
Keyword usage is important - keyword density just isn't the right way to measure or message it.
H1s provide a small amount of value, but barely more than just using the keyword(s) at the top of your page.
Titles are certainly critical, and we think there are likely some good things going on around relevance detection, semantic analysis, etc. (haven't been able to prove these yet, but it's a project in the works). Link structures are absolutely essential. Getting rid of duplicate content, effectively using XML Sitemaps, building smart site architecture - these are all effective and valuable techniques that no one's questioning.
Maybe I need to do a better job writing to make the disagreements clearer and call out what's still left as important tasks.
I have just completed a round of on page for a client...implementing what I see as best practice...(including ALL meta tags, H1 tags, internal linking and increasing the keyword usage on each page) all the things that are on page 101 elements.
Whether or not search engines look into these factors, of the 120 search terms being targeted, what I have seen is that the results on Google have barely moved, maybe a couple of positions (guessing it's the title tag)
Whereas, on Yahoo, the results have gone crazy!!! now ranking on the first page for about 50 of the terms.
The only thing I can take from the work I have done is that Google is still in love with links (bought or manually earnt) and Yahoo is loving the on page
So, with any new clients, I will always recommend good on page optimisation before any other work. Even if it is just for Yahoo results.
I think a lot of the time these posts can easily be taken the wrong way. People either misread or misunderstand what is being said in the article and assume that it's devaluing their work or making it seem unsuccessful and a waste of time.
Remember, you should always take any articles you read online with a pinch of salt. I'm not questioning the validity of this post, but many of us have personally seen results with certain on-page changes that have been discredited or devalued in both this post and many others.
It does not mean that it won't work and it certainly doesn't mean that you should stop doing it. Providing a semantic, search-engine friendly site is something that every SEO should be doing anyway.
You're not wrong, and it's not very often that people are wrong when it comes to what they've seen first hand. Sure, people may try and say that you are because you have no solid evidence but if you know you're right then just keep going. Just remember that you need to adapt and evolve too. Don't just rely on on-site - I think that's the point they were trying to get across with these articles.
Many thanks for all of your excellent responses everyone. I have to admit I felt slightly panicked when I read the post - thinking that after several years in the business I was reduced to the status of noob again. (Of course this is a slight exaggeration.)
Like all onpage elements (well, except maybe meta keywords), every little bit working together can cohesively form a strong onpage strategy.
goodnewscowboy - I agree. You can rule a non-competitive space with onpage and link building. Hence why this post frightened me a little bit.
mikevallano - "But an H1 does not page one rankings make" Ha! Very true wise one.
Rand - thank you for listing all the points you still feel are valuable. This would have been helpful to include in the post, but you can't always be expected to think of everything. Thumbs up to your post as it's provided fantastic food for thought.
theexo51 - You're so right. Another thing I've noticed Google loves (versus Yahoo!) are domains with keywords in the titles. I am surprised this tactic works so well still actually ...
traxor - I realize you need to take everything you read with a grain of salt (I've been in the biz a while), but I still highly value the opinion of Rand. Always have. Now that I have the sole responsibility of running a couple of web companies' SEO departments, I feel a lot of pressure to get it right! With small monthly client budgets I don't have the hours or budgets to play around too much and figure it out. I totally agree with you - in this industry we need to constantly adapt and evolve. (Which is what makes this such a great job. Easy is boring.)
Yeah, I'm not discrediting anything that Rand is saying. I respect absolutely everything that he says, but I really feel for him in the sense that a lot of people have misunderstood what he's saying, which is why I say "take it with a pinch of salt."
Easy is definitely boring, so it's great to be in a profession that evolves so quickly!
Very very very very good article as usual, even though I do not 100% agree with several points.
A lot of the comments on Stephan's article seem like they are based on personal experiences rather than the research and correlation data, so I would agree more with Stephan and Rand.
It is a challenge with a lot of the misinformation out there, but if you follow trustworthy sites like SEOMoz you will be in pretty good shape. However, even SEOMoz didn't catch on to the change in Google's nofollow policy until several months after the change. If you communicate to clients that Google is like a black box, and you are working with the best information available, clients will likely be more forgiving.
The best advice would be to just test it out yourself and know with at least some confidence where you stand on these topics, because experts you respect and follow in the SEO industry have conflicting (valid) opinions.
Hi Rand.....someone asked here, what is TF-IDF? I wiki'd it... there was something interesting.... the article says "Hence an inverse document frequency factor is incorporated which diminishes the weight of terms that occur very frequently in the collection and increases the weight of terms that occur rarely."
If this is exactly the way search engines work then increased keyword density can backfire.... I think there is some balance that search engines use to find related words and then pick up references within the content to determine the right word to rank... ???
in regards to myths.... i don't think something is a myth even if there is little use in doing it.... there are so many signals search engines use to rank.... if H1 is just .01% of it I wouldn't call it a myth and disregard it.... I think the point you and Spencer are trying to make here is "do not overdo anything, keep it real"... I liked both the posts... but didn't quite agree in some places with Spencer, just the way he has expressed it... If someone's true intent is to misguide then it's ok.... there are days that everyone does a thing or two and we learn by it later... after all most of our knowledge is built with these experiences...
anyways the posts were good and thank you very much for your efforts.....
I've been seeing increasing numbers of my clients telling me about how SEO companies have been claiming to be Google partners when pitching SEO services. While it's not overly harmful given said companies' qualifications, it's deceptive, and should be put to a stop.
A questionable area is updated content on the site. One of my clients hadn't updated their blog in about 3 weeks and had dropped about 4 positions down in the SERPs. As soon as they'd updated their blog with new content, the site shot up again in the rankings. It's hard to know if this is because of increased semantic content or not. There were no new inbound links, so it's hard to say for certain whether it was the content or the recency that drove the rankings up. It's worth wondering how much recency plays a role as a signal for sites posting relevant content. While inbound links and other signals are much stronger, it may be something to pay attention to.
I've discussed outbound links with other SEOs and there's been evidence of increased rankings in local search especially. For example, a company in Edmonton linked to West Edmonton Mall's site, wem.ca, and promptly jumped 20 positions in the SERP without any other changes. It's questionable whether PR can flow back in moderation to the linking site, much like a nofollow releases some PR, or whether it's just the pure semantic relevancy of the other site. I noticed that when I nofollowed a ton of sites from a blog, the overall rankings went down instead of up. I'd imagine there's got to be a healthy ratio of relevant inbound links to outbound links to pertinent semantic content. If you think about it, it doesn't make much sense to penalize sites for outbound links to relevant content. That's almost as bad as personalization on the SERPs.
Stumbled across this post via Google when looking for inspiration / ideas for my next Tidy Design blog post... Some interesting thoughts / comments here, even if they are over a year old! How can you not love SEOmoz!
I completely agree with Stephan Spencer, not just in theory but in my own experience as well.
what is TF*IDF?
I had to ask that too - TF*IDF - Term Frequency * Inverse Document Frequency
If you go up the comments, Rand linked out to a full explanation
Interesting observations Rand. Question - If using keywords in header tags has little or no influence, isn't it time to make some changes to the Term Target tool?
Best,
Neil
You're absolutely right, Neil - we've been really hard at work on a product that's going to be replacing many of our tools with vastly upgraded versions that should be launching in July. I wish we could make things faster, but our engineering staff is going as fast as they can.
Yet another Great post!
I try to avoid hyphens at all costs. I always ask myself what I would have to say to a lead when referring them to the website over the phone: "Have a look at hotel hyphen for hyphen dogs dot com"
No thank you
But thank you for the post ; )
Rand, you obviously took a lot of time writing this post and stating your position on a lot of contentious issues. Did you feel compelled to write this post due to perhaps not agreeing 100% with the opinions of one of the co-authors of your SEO book?
I hate to say this but if you believe the data from Google Webmaster Tools, personalization has some of my keywords ranking at #1 through #50. As far as the home page, I have had sites that rank really well with the same flash running on the home page for months at a time. Links - Don't join the link farms. Build your links the right way, not the spammy way. Keywords - that depends. If you're running a website about trucks - use the word trucks. Don't try to use some fancy euphemism for trucks. If you want to be snooty and avoid the word trucks in your copy, you're going to pay the price. Hyphenated domains - HATE THEM - all you do is send traffic to your non-hyphenated competition. Every time your sales guy says come see us at my-domain.com, your customer is likely typing in mydomain.com. Click Through - Agree. Headings - not just the H1, but the H2, H3, etc. also help pages rank better. Much better to use the heading tags than bold tags. Page Rank - Agree. Content - Agree. You can have the best content in the world and still have sucktastic rankings if you don't do the rest of what needs to be done to rank well.
Thanks Rand - another insightful roundup - I was hoping you would keep the debate going over here :)
Great work from Stephan too as usual.
great stuff.....off to read the full list now ;~)
*Warning - Noob SEO Alert*
is there a glossary for all the terms used here?
QDF? TF*IDF??
i think QDF is "query defined freshness"
let me write that down :) - or should I not bother as it doesn't seem to matter :S
gotta love this industry
haha....so many terms...so much to do.....maybe I should do something simple like become a physicist. On a note about the QDF. I think Rand has explained it by stating that if you have some news out on your blog for example.....the robot crawls it and it has quite a bit of other queries going on about the same subject (i.e. hockey playoffs) you may be ranked higher (although temporarily) for hours/days weeks until the interest in the subject subsides. I think what Google is trying to do is give the surfer the best quality user experience by giving the most recent relevant posts.....does that make sense? If it doesn't, ask Rand ;~)
as a Brit, not sure on the hockey example lol....
but that does make sense...just the TF thingy to learn :)
Sorry about that!
QDF - Query Deserves Freshness
TF*IDF - Term Frequency * Inverse Document Frequency
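Since TF*IDF keeps coming up in this thread, here is a minimal sketch of the calculation for anyone curious - a toy example using the common log-scaled IDF variant, not a claim about how any engine actually weights terms (the function name and corpus below are made up for illustration):

```python
import math
from collections import Counter

def tf_idf(term, doc_tokens, corpus):
    """Score one term for one document against a small corpus.

    term:       the word being scored
    doc_tokens: list of tokens in the document being scored
    corpus:     list of token lists, one per document
    """
    # Term frequency: how often the term appears in this document,
    # normalized by document length.
    tf = Counter(doc_tokens)[term] / len(doc_tokens)

    # Inverse document frequency: terms that appear in fewer documents
    # across the corpus get a higher weight (log-scaled, with +1 smoothing).
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / (1 + docs_with_term))

    return tf * idf

# Toy corpus for illustration only.
corpus = [
    "best trucks for towing".split(),
    "trucks and trailers for sale".split(),
    "hotel for dogs reviews".split(),
]
print(tf_idf("trucks", corpus[0], corpus))  # common term -> weight near zero
print(tf_idf("towing", corpus[0], corpus))  # rarer term -> higher weight
```

Running it shows the point of the IDF part: "trucks" appears in most of the toy documents and so gets almost no weight, while the rarer "towing" scores higher.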
A very interesting post, Rand. I personally think the original article raised some interesting points, which you have covered off here in an objective manner. Thank you, sir.
I think a lot of sites with a higher PR rank higher not because Google uses that data, but because a higher-PR site will naturally have more links pointing to it, and hopefully many of those links will have the right anchor text.
Nice post, Rand! I think we are all a little guilty at times of becoming like sheep, listening more to what others say rather than doing our own testing, finding out what produces the best results for ourselves and drawing our own conclusions.
Recip links don't work, it's bad to link out, H1 tags have no value, blah blah blah. There are just so many differences of opinion out there. It must be terribly confusing for CEOs and prospective clients reading so much contradictory advice.
At the end of the day it's results that count, and testing is a vital part of any SEO's arsenal.
Great post...
With reference to:
Can Linking to Other Sites Help You Perform Better?
...does this mean that if we link out to Wikipedia or the BBC from a blog, it gives us more authority and trust?
My understanding would be that if the link to those sites is relevant and in context with the content of your page, then there may be some benefit to be had.
I often advise my clients to link to these sites where it's relevant, as I do for my own sites. If you are talking about something and there's a helpful or relevant page on it, link to it!
If you run a travel website and don't sell travel insurance, linking to one or two travel insurance companies that you trust helps the web overall and makes your website more usable. It can be better for your visitors if you have the time to make sure you deep link to the relevant page.
If you mention businesses or firms and users have to leave your site and Google them to find the address, I would think that's a bad user experience, and Google may see it as a bounce.
I'm not an expert, but it seems logical that if visitors can complete an interaction within your site without having to go back to a search engine, that's a positive - they may be more likely to return, or to promote your travel site as a good source to friends on Facebook/Twitter/blogs.
If you are a video production company and have done work with the BBC, linking to the relevant page/programme you worked on is a good idea. If you use or paraphrase something from Wikipedia, I wouldn't say it's a bad idea to link to it, even if it's just mentioned in the footnotes as a reference source; it shows more transparency for your site.
That's a nice post, Rand.
Interesting post. Good point about the use of H1 tags. Many sites benefit from using H1 tags simply because it gets target keywords into their headlines, rather than leaving those keywords out of their content entirely.
Great to see that people are saying you don't need to frequently update your homepage. Some people can definitely take it overboard!
I still think there is value in having "Latest Posts" / "Latest Press Releases" links on the homepage to make users and bots aware of the new content.
Could you not just update the XML sitemap with the new content?
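On the sitemap suggestion just above: regenerating the XML sitemap with a fresh lastmod date is one common way to flag new content to crawlers without touching the homepage. A minimal sketch, assuming the standard sitemaps.org format (the URLs and helper name below are hypothetical, for illustration only):

```python
from datetime import date
from xml.etree import ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: list of (url, last_modified_date) tuples -> sitemap XML string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # lastmod is the explicit freshness signal for crawlers.
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs, for illustration only.
print(build_sitemap([
    ("https://www.example.com/", date.today()),
    ("https://www.example.com/blog/latest-post", date.today()),
]))
```

Whether the engines act on lastmod promptly is another question, but it gives crawlers an explicit update date without forcing homepage churn.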
I'm in agreement with Stephan on this one - so many sites I come across in searches that include a specific day are from years ago, with no new content.
The one aspect of SEO that every serious blog seems to agree on is building links to the website (of course, how we do that varies slightly from site to site, but it's still the number one thing).
A little bit of knowledge is a bad thing, and unfortunately, as SEOs, the majority of people we speak to have little knowledge - that's why they speak to us. The trouble is how to communicate the "best" way to do something when no one can agree on it.
Really cool post! Great to read and -as usual - good comment by @gfiorelli1 (oh so true!).
This was an important post, and while SEOmoz has addressed these items time and time again, it was nice to see two big figures in the SEO industry agree, for the most part, on many of these myths.
Thanks for providing the insight even if it isn't making you more popular.
Did you just call Rand fat???
DON'T GO THERE GIRLFRIEND
Oh man trax. I took a gamble at humour and I got the dreaded thumbs down. Snap!
Boy, this is one of those come-back-to posts when you get into debates about the topics. Thanks for the analysis.
Wow! Great debate here... I just like it this way! :)
It was especially fun when I read the information concerning keyword density... Tools such as Scribe SEO, which has powerful affiliates (e.g. Brian Clark), make use of it...
It made me wonder if this stuff was really useful when I started in the SEO field... and eventually had me quit. :)
It's wonderful to have the Spencer vs. Fishkin opinions... I couldn't tell which one was (imho) better while reading The Art of SEO!! ;)
I like the analysis and pretty much agree on all parts. Well done.
Rand, thanks for the great commentary. That's what's so great about SEO: without the actual algorithm, we all have to work with our experience and research. I'm not overly invested in the research side myself, so it's great to hear what you and SEOmoz have found scientifically. Keep sharing the great insights!
I think this explains 75% of ranking well, but there still seems to be about a 25% mystery part of Google's algorithm.
Since everybody is incorporating these strategies now, the trick is solving that last 25%.
Detailed, as always.
Thanks for posting this Rand. A very informative and enjoyable read.
Ya know what I love most about this article? The poorly-thought-through responses to Stephan's remarks. I'm a big advocate of the KISS (keep it simple, stupid!) philosophy, especially when it comes to SEO.
Build a good quality clean webpage/site
- then -
Spread the word about it!
The rest is fluff...
Rand is right on with many of his opinions, whether scientifically supported or not. So many things in the world of SEO and online marketing are common sense, but too many of us are too busy hunting for the silver bullet to see that.
On a general level, you are right. Simplicity and common sense must always be kept in mind so as not to get lost in speculation that leads nowhere.
But I don't agree with the rather simplistic theory you express (no offence intended).
In fact, I've seen so many good-quality pages/sites that never achieved the success they deserved, even though the webmaster was shouting the word about them.
Exaggeration from either side is not the safest way to approach web marketing (or anything in life).
I think it totally depends upon the space you are occupying. If you have little to no competition, I completely agree.
If you have heavy competition, it's an entirely different ball game. For the really heavy hitters (mortgages, etc.), I'd go so far as to say it's a different sport!
Another great post! Thanks Rand!!
I wrote a Google knol about a similar topic a few months ago https://knol.google.com/k/jay-granofsky/seo-lies-and-myths-revealed/3udnymr0y2e8g/11#
The one thing I disagree with you on is that PageRank has a direct correlation with good rankings. I have found so many low-PageRank sites that perform amazingly in the SERPs that I know it can't have much of an effect; in fact, my highest-ranking and highest-Google-traffic site has been PR0 for the past year and a half!!
Great and interesting post, Rand. I personally think the original article raised some interesting points, which you have covered off here in an objective manner.
Thanks for posting and sharing
Really great and nice post, Rand. I think a lot of people just disregard certain SEO beliefs as myths when they're partially true, and vice versa too.
But remember that some of those myths can actually become reality in the future. There is no guarantee, so it is important to see what major sites are doing - search engines tend to adapt to those sites.
Really sorry but I don't understand what you mean here? I seriously doubt that search engines adapt their algorithms to suit large sites. That's ludicrous.
I was shocked by the conclusions. I feel there is no absolutely right or wrong method in SEO, and it's important to test things yourself rather than just checking these blogs all day.
Great post, Randfish! There are so many different opinions on SEO practices that it's kinda like we're just winging it sometimes. ;-) Good job clearing these ones up... now I'll have to read the full list. ;-)
Cheers,
Kevin