Following Nick's lovely recommendations for community building, I'll be doing a bit of link baiting in this thread. Actually, that's a lie; I'm just ticked off about this article from SEO News' Rob Sullivan about a conference call he had with some Google employees. I'll just refute it piece by piece, since several items are seriously problematic:
Is Pagerank Still Important?
The short answer is yes - PageRank has always been important to Google. Naturally they couldn't go into detail, but it is as I suspected. Google still uses the algorithm to help determine rankings... My feeling however is that they've simply moved where the PageRank value is applied in the grand scheme of things.
Real Answer - No. PageRank (what we view in the toolbar and the Google Directory) is virtually useless; it's old data, inaccurate data, and only really useful to link sellers who think they can pull the wool over unsuspecting buyers' eyes (that's not to say all link sellers do this, but there certainly are quite a few). What MAY still be useful is global link popularity (the total value and importance of all the links pointing to your site/page). PageRank is supposedly a measure of this, but there's a big distinction, and Rob should have made that clear. He does make it, a little, in another article that he links to, but even that one has plenty of other items I find objectionable - like, "guess which factor determines which top results are returned? You guessed it – PageRank." I'll have to deal with that another day.
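To make the distinction concrete, global link popularity is computed over the live link graph, while the toolbar number is just a stale, quantized snapshot of one such computation. Here's a minimal sketch of the classic PageRank power iteration on a made-up toy graph - the node names and damping factor are illustrative, not anything Google actually uses:

```python
# Minimal PageRank power iteration on a hypothetical toy link graph.
# Node names and damping factor are illustrative only.

links = {
    "home": ["about", "article"],
    "about": ["home"],
    "article": ["home", "about"],
    "orphan": ["article"],   # links out, but nothing links to it
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the values settle
    new_rank = {}
    for page in pages:
        inbound = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

The point of the sketch is simply that the score comes from who links to you and how important they are - not from a green bar that gets refreshed a few times a year.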
Are Dynamic URLs Bad?
Google says that a dynamic URL with 2 parameters "should" get indexed. When we pressed a bit on the issue we also found that URLs themselves don't contribute too much to the overall ranking algorithms. In other words, a page named Page1.asp will likely perform as well as Keyword.asp.... The difference however is that in almost all cases I've seen the static URLs outrank the dynamic URLs especially in highly competitive or even moderately competitive keyword spaces.
This started out well. I give my clients the same advice - more than 1 dynamic parameter may still get indexed, but I don't recommend it. URLs themselves don't contribute to rankings - that also fits with my experience and knowledge. But, I can't get over how poorly the last line is explained. As MSN noted in my interview with them (and as competent webmasters around the web know), the reason that static URLs typically rank better is not due to anything in the algorithm specifically, but because they are more likely to be linked to and more likely to be used in important places (like homepages or top level category pages or big articles). Again, providing full information goes a long way towards not confusing readers.
Does Clean Code Make That Much of a Difference?
Again, the answer is yes. By externalizing any code you can and cleaning up things like tables, you can greatly improve your site. First, externalizing JavaScript and CSS helps reduce code bloat which makes the visible text more important. Your keyword density goes up which makes the page more authoritative.
Oh, brother. Not only is keyword density a nonsensical myth, but the idea that search engines aren't sophisticated enough to apply the linearization and stemming techniques described in IR textbooks since the '70s is ridiculous. Just to put icing on the cake, what is it exactly about keyword density (or any type of keyword use or relevance) that makes a page "authoritative"? Authoritative means that it is a reference standard in the industry or niche - using more "keywords" or having more keyword importance can't possibly affect this. Links can, references can, mentions can, even popularity among visitors could arguably be considered, but keyword density? I'm not impressed.
In a very, very convoluted way, there could be legitimacy to this argument, but Rob doesn't use it. You could argue that cleaning up code makes people more likely to link to you and prevents errors which could cause spidering problems. It's a stretch, sure. But at least it's not flat out wrong.
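For what it's worth, linearizing a page - stripping the markup and keeping only the visible text - is trivial even with off-the-shelf tools. A minimal sketch in Python, using a hypothetical messy snippet, of the kind of extraction any indexer performs before it ever counts a term:

```python
# Hypothetical sketch: extracting visible text from messy, table-laden HTML.
# Any indexer does something like this before it counts a single keyword,
# which is why inline scripts and styles don't "dilute" the visible text.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

messy = """<table><tr><td><script>var bloat = 'thousands of bytes';</script>
<style>td { color: red }</style><b>Link building</b> advice</td></tr></table>"""

extractor = TextExtractor()
extractor.feed(messy)
print(" ".join(extractor.parts))   # -> "Link building advice"
```

If a few dozen lines of standard-library code can pull the text out of that mess, the engines certainly can.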
Oh, and this was nice, too:
Any reproduction of this article needs to have an html link pointing to https://www.textlinkbrokers.com.
Why TextLinkBrokers? The article is hosted at SEO-News.com... I'm a little worried, as I've used and recommended this firm in the past - and generally been happy. I'm not blaming them, but it sure is confusing.
Let's hope that Rob's a big enough guy to admit his mistakes and make amends. It's something I've done many times (most recently with the ranking factors article, which took a lot of guff until I could get some top folks to help me out with it).
Hey Guys,
We're seeing a bunch of referrals from this site, so I figured I would check out what all the fuss is about. Looks like our article campaign is working pretty well. We are getting tons of relevant links as well as a ton of clickthroughs :)
SEO-news is just syndicating articles that Rob writes for Textlinkbrokers, which is why they are required to link to us.
I won't get into a debate about the merits of what Rob is saying. Everyone's got their own opinions on the subject of SEO, which is why I stopped debating with people a long time ago.
I will say two things though. Rob's article was only commenting on what Google had told him. You all know as well as I do that this information can only be somewhat trusted.
Rob is a smart guy and a very successful and respected SEO, which is why I like having him write for us. I don't always agree with everything he writes, but he is consistently more accurate than most of the baloney out there.
As for trusting Textlinkbrokers to give advice about PageRank, I would submit this to you: Textlinkbrokers puts very little emphasis on PageRank for many of our link building programs, including the permanent link program, which is our biggest. And as you can see, we also do a lot of link building that has to do with articles, etc.; we provide these same link building services to our clients.
However, PageRank is still a good rough indicator of "global link popularity," and a lot of our current and future clients demand that we include information about PageRank on every site that we list.
Jarrod Hunt
It would be interesting to test the keyword difficulty tool either without a PR weight or with a dramatically reduced one, to assess its results.
Well, looks like I can't help myself but comment on your statement about externalizing code, Rand.
I completely agree with Rob on this. His main point was not that it increases "keyword density"; his main point was that it reduces the chances for code errors. But at the same time, he is 100% right about the fact that it will increase keyword density. He is not arguing whether keyword density is important or not, he is just saying that it will increase it.
As for the importance of keyword density, I will submit this: if you have 500 words of good quality, relevant content and introduce thousands of words of code, you can seriously hinder that page from ranking well. It is very tough for a spider to interpret large and complex pages. Individual word density isn't that important, but the density of relevant text vs. irrelevant code is.
Reducing the amount of irrelevant content (code) on your site WILL help to make that page more relevant for the text that you are optimizing for.
There is also one more reason for externalizing code that Rob alluded to: it reduces the size of your pages. I have personal experience with this. About a year ago we reduced the average page size on our site from about 150k to about 70k just by externalizing code. I'm sure none of you would argue against the benefits of such a large reduction in page size.
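To put a rough number on that kind of bloat, here is a minimal sketch - the file name is made up, and a regex is only an approximation; a real audit would use a proper HTML parser:

```python
# Hypothetical sketch: estimate how many bytes could be saved by moving
# inline <script> and <style> blocks into external files.
import re

html = open("product-page.html", encoding="utf-8").read()  # made-up file name

inline = re.finditer(r"<(script|style)\b[^>]*>.*?</\1>", html,
                     flags=re.IGNORECASE | re.DOTALL)
inline_bytes = sum(len(m.group(0).encode("utf-8")) for m in inline)
total_bytes = len(html.encode("utf-8"))

print(f"total page size: {total_bytes / 1024:.1f} KB")
print(f"inline script/style: {inline_bytes / 1024:.1f} KB "
      f"({100 * inline_bytes / total_bytes:.0f}% could move to external files)")
```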
Jarrod Hunt
Jarrod - I can't agree with this statement:
"If you have 500 words of good quality, relevant content, and introduce 1000's of words of coding, you can seriously hinder that page from ranking well. It is very tough for a spider to interpret large and complex pages. Individual word density isnt that important but the density of relevant text vs. unrelevant code is."
If my programmers can write code to parse even the worst of table-laden, poorly coded HTML pages, surely the search engines can do a much better job. And your point that Rob is only saying it improves keyword density, not actually suggesting that it influences rankings, is taking a page out of the worst political doublespeak.
I'm a fan of TextLinkBrokers - I've used them in the past and we have a good relationship. The quarrel isn't there. It's with this patently false information being circulated about how search engines and rankings operate. Let's be reasonable and call a spade a spade. I've certainly seen more misleading articles out there, but never one claiming to have Google as the direct source. It's precisely because the information comes from such "legitimate" sources (including Rob) that I'm concerned.
Hi Randfish,
If all SEO's agreed on everything, forums would be very boring :)
I for one will always believe that the less complex a page, the better. Sure, the search engines can parse just about anything that is out there, but that doesn't mean that making our pages easier to read isn't beneficial. One example is the time factor: search engines can and will only spend so much time indexing a site's pages. The simpler the pages on the site, the more pages can be indexed per crawl. More pages getting indexed faster is always a good thing.
Ahh! You hit the nail on the head!
Any article discussing Page Rank which requires a link to text link brokers is SUSPECT.
Text Link Brokers is a service that gets paid for text links. What method do they use for determining the value of links? Page Rank!
So of course they are going to promote Page Rank!
Dave,
You make a point, and a good one. The tools use PR because it's the only way we have to measure global link popularity. I'm not saying it can't be used for that reason; I'm just not thrilled about seeing it used to discriminate for links, or promoted as a major part of Google's ranking algorithm.
With tools, we do what we can with what we're given, but in all honesty, I don't even have the toolbar on my browser because I believe it to be unnecessary.
But, Michael, Rob is not arguing for valid code or non-broken code; he's suggesting that "externalizing JavaScript and CSS" will improve keyword density and make your page more of an authority.
I'd agree that trying to fix busted HTML is no fun for any spider or indexing tool - that's not what I'm on about.
Sebastian - Seems like you would be the guy who knows. I'm impressed with all the work you've got on this subject at - https://www.smart-it-consulting.com/article.htm?node=133
Rand, are you certain it's a new article? If that had been written three years ago, I'd have agreed with a lot of it.
I think that putting this article on your website can affect you with duplicate content ...
I think someone has to say something to defend PageRank ... :-)
Is PageRank important ... no ... It is not very important for web positioning, but it is important as a psychological factor for visitors ...
A visitor can say: Ohh! a PageRank 5 site! ... A lot of people use PageRank as a tool for orientation in link exchanges, but I refuse websites that are not ethical, or have bad web design, too many links, a different theme, or not very much traffic ...
PageRank can be a factor, but not the principal factor ... :-o
Rand: if you are so dead set against the importance and relevance of PR, I would remove PR measures from every one of your tools. Your reliance upon them in various tools only validates the argument that existing PR is important.
Dave
" My feeling here is that if you MUST use the Google sitemap to get your site indexed, then you have some serious architectural issues to solve. "
This --as an absolute statement-- is not true. Sure, an XML sitemap won't get a crappy site indexed. But that's not the point. Google Sitemaps provide a procedure to get fresh content indexed in no time, which especially helps large, dynamic sites. Targeted crawling reduces the time to index. Thus dynamic XML sitemaps are a MUST for frequently updated sites; I'd even say they are an essential part of a close-to-perfect Web site architecture.
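To illustrate what a dynamic sitemap boils down to, here's a minimal sketch that writes a sitemap XML file for freshly updated pages - the URLs, dates, and namespace are illustrative, so check the current protocol spec before relying on it:

```python
# Hypothetical sketch: generate a minimal XML sitemap for freshly updated
# pages so the crawler can find them quickly. URLs and dates are made up,
# and the namespace depends on the protocol version you target.
from xml.sax.saxutils import escape

fresh_pages = [
    ("https://www.example.com/articles/new-post", "2005-10-14"),
    ("https://www.example.com/products/widget-42", "2005-10-13"),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url, lastmod in fresh_pages:
    lines.append("  <url>")
    lines.append(f"    <loc>{escape(url)}</loc>")
    lines.append(f"    <lastmod>{lastmod}</lastmod>")
    lines.append("  </url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```

Regenerate that file whenever content changes and the crawler has a fresh, explicit list of what to fetch, instead of having to rediscover it by following links.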
And what you said.