Someone mentioned to me today that Bruce Clay has done a great job of optimizing his site over the past year, and reclaimed some of those top positions for broad level SEO terms. One of these is search engine optimization, where Bruce ranks in the 2nd spot:
When I analyze why a site/page is ranking well, I'm typically looking at:
- Page Strength, which is good for a quick overview - 6.5
- Links to the domain via Y! Site Explorer - Bruce has - 41,314
- Number of unique domains in the top 200-300 links - I estimate around 50% of the top 250 are from unique domains
- Ratio of high/low quality links - usually I do this with a manual review
- Who's linking and why (natural, manipulated, paid, etc.) - again, manual review req'd
- Anchor text (the neat-o tool is a must for this) - lots of instances of "search engine optimization," which surely helps :)
- Links to the page from Bruce's site - 1,070
- Links to the page from external sites - 7,757
- Links that Technorati knows about - 1,206
- How frequently are new blog links appearing (just look at the included timeline next to the links)
- What authority levels do those links have (sort by authority to see)
- A bit of KW usage data about the page - The Ranks.nl tool is still the best
- Google PageRank of the domain and page (not a great one, but it's still a signal) - Page = 6, Domain = 7
- How it ranks at Ask.com (with their picky local popularity algorithm) - 5
- The strength of who else is ranking in the top 10 (WBP's cool seo tool is good for this, as is SEOmoz's own KW Difficulty)
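The unique-domains check in the list above is simple enough to script once you've exported a list of backlink URLs from Y! Site Explorer or a similar source. A minimal sketch - the function name and sample URLs are my own illustration, not part of any SEOmoz tool:

```python
from urllib.parse import urlparse

def unique_domain_ratio(backlink_urls):
    """Fraction of backlinks that come from distinct root domains.

    Treats "www.example.com" and "example.com" as the same domain;
    anything fancier than stripping a leading "www." is left out.
    """
    if not backlink_urls:
        return 0.0
    domains = set()
    for url in backlink_urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # collapse www/non-www variants
        domains.add(host)
    return len(domains) / len(backlink_urls)

# Four links, but two share a domain -> 3 unique domains out of 4 links
links = [
    "https://www.example.com/page-a",
    "https://example.com/page-b",
    "https://blog.other-site.org/post",
    "https://third-site.net/",
]
print(unique_domain_ratio(links))  # 0.75
```

A ratio near 1.0 suggests a broad, natural link profile; a low ratio hints at many links coming from the same few sites.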
All in all, Bruce's page is a solid competitor with a lot of strong factors backing it up. My only concern with the page as it stands now would be its effectiveness over time in attracting new, natural links. If you look over the page from a human perspective, my guess is that very few folks would be likely to send links to that page after visiting - it's more sales and navigation than resource.
What's your strategy for investigating a site's ranking ability?
I think you gave a great breakdown on this, Rand! I couldn't have done it any better!
Rand, can you turn down the volume on this blog? There's too much noise here and I'm getting lost.
Really? Of all the blog posts we've authored recently, I thought this one was among the most "signally" and useful.
What can I say - SEOmoz readers are a very tough crowd to please.
I agree. This post is pretty on track.
Great post and fascinating comments!
I was being sarcastic... I guess if you have to explain a joke then it's not that funny. Speaking of that, what's with your "turtle and the hair" comment below, Aaron? That one flew right over my head.
I sure hope Aaron is referring to the story of the turtle and the hare - i.e. slow and steady wins the race and not "the story of the turtle and the hair".
Which one is better for keyword usage data: Ranks.nl or gorank.com? I'm using gorank and the data are totally different compared to Ranks.nl. Which one is accurate?
https://www1.zuula.com/SearchResult.jsp?st=sea...
Bookmark the above link - it is really helpful for instantly analyzing the SERPs or Google Yahoo MSN Ask
What is really phenomenal is the few SEOs that managed to do well on ALL of them. :-o :-) :-D
Anyone ever read the story about the turtle and the hair?
;o)
Definitely, backlink analysis of a site can give good information, or at least raw data, about its potential to rank for particular keyword(s). But it's limited to a maximum of 1,000 results (the standard cap) from the three top search engines. For deeper analysis within that restricted environment (the 1,000-result limit), I try to assess the true authority of the websites linking to the site in question within its niche. I use the BacklinkWatch tool for deeper backlink analysis, with outbound link counts and PR information, and then some custom home-written tools to generate sophisticated reports for quick reference.
Maybe they have done strong SEO with white-hat techniques.
Hello, fellow SEO specialist: I notice the same factors that you describe above. I also know from experience that Bruce Clay has major pull with Google. In other words, it's not what you know but who you know. This is true not only for Bruce Clay, but for many websites listed with Google.
However, Google would say my statement is false. Don't take my word as fact; do the research and watch the light bulb go off. This is the American way, not always fair but always equal opportunity....... Ha!..... Ha! Ricky1
If you are surprised at Bruce Clay ranking so high, try searching for "SEO" and look for ihaveawebsite-nowwhat.co.uk
Google.com = 9th, Google.co.uk = 3rd
According to SEOmoz's Page Strength tool it only has a page strength of 3, but it ranks higher than the likes of MattCutts.com, SEOmoz.org, and BruceClay.com.
Some stats for ihaveawebsite-nowwhat.co.uk: Page strength = 3, Links = 3,368, Domain age = 1,426 days, Technorati links = 26.
None too impressive, eh? So how do they do it?
J.
It's interesting how Bruce Clay and SEOmoz keep writing about each other!
I'll tell you one thing regarding that last post of theirs on the noise coming from blogs; perhaps if they opened up their blog to the public and let readers comment, it would not feel so eerie & dead over there. Seriously, I know the folks over at Bruce Clay are reading this, why so unsociable? Your blog is so Web 1.0
I think many of us would take SEOmoz’s happy hour atmosphere any day over their dark alley soap box.
Come on, it's not such a dark alley soap box. Lisa has a warm and effusive femininity about her. I read her blog just to get a little sunshine in my day. After I click away there are often little rainbows over my computer. And sometimes purple ponies scamper.
:-) In all seriousness, I'm a Lisa fan. A recent post of hers about "coyly staring" at a friend of mine was killer material - guaranteed laughs for us at his expense.
Lisa you go girl!
ok, who crashed the Neat-O tool?
Since SERP climbing is all about who you're outranking, strong page strength is more relative to competitive pages rather than individual results...
I wondered too, Rand.
Bruce Clay used to have some strong tools and content (like the SE relationship chart) but it seems like it's been a while - and there are sites that I'd think were much better connected now.
SEOChat is another mystery to me. Yes, it has a lot of tools, but... ?
SEOChat has a massive network behind it that continues to generate relevance (and links) for them. Their partnerships (or ownership, more accurately) really do the bulk of their link work for them. I was quite impressed to see Bruce beating them out, actually.
I think the SEOchat thing is a ton of UGC. That domain has huge amounts of content, and members often linked to the site's discussions.
When they pulled all the outbounds off all those links, they basically pulled the plug on a lot of reciprocity and linking to bad neighborhoods... and the v7n contest and their directories were rather huge too.
Quality, Age and Format
What criteria do you use to establish if a link is low quality or high quality?
I think a lot of it is based on experience - you follow those links, have a look at the site, and in a matter of 2-3 seconds have a good feel for the "legitimacy" and "quality" of both the site and the link it's providing.
While Bruce Clay is definitely worthy of being in the top 20 based on his optimization and link profile, I don't think he would be in the top 10 if it weren't for his 1997 domain name.
and...
I wonder if his recent rise to the top of the ranks has anything to do with him adding a "noindex" to all of his linkmap pages?
Honestly though, I have never understood why in the world you would want to link to everyone that was linking to you. Basically creating a reciprocal link out of every natural link coming into your site.
It appears that Bruce now believes the same thing.
Take a look at his linkmap page, then look at one of his clients. hmmm, very interesting.
https://www.bruceclay.com/LinkMaps/
<meta name="GOOGLEBOT" content="FOLLOW, NOINDEX">
https://www.cars4causes.net/LinkMaps1.asp
<meta name="robots" content="index,follow">
I normally wouldn't "rock the boat" like this, but I recently had a client get de-indexed from Google, and a lot of it had to do with his use of LinkMaps. Only to find out that Bruce has a "noindex" on his own LinkMaps... Things that make you go hmmmm.....
I'm sure Bruce had good reasoning behind his linkmaps when he developed them back in the days of Altavista, but seriously, times have changed.
I always wondered about that linkmap thing. I also have seen a couple of people get stung by it. But it could be the wrong implementation of it.
I'm having a tough time understanding the goal of that linkmaps system as well, particularly on the clients' sites. Maybe it's just a tracking system for inbounds? Part of how they request links? It's certainly odd that they'd keep it public - great find Jarrod.
Rand,
They purposely make the LinkMap pages available for the search engines to see. The idea is that some of the pages that have your link on them may not have been seen by the search engines yet, so if you link to those pages, you are helping the search engines find links to your site.
In their own words:
Unfortunately, Linkmaps also have the magical ability to get your site penalized because they make it look like every link you have ever gotten is a reciprocal link.
EEEK
I swear to everything I hold holy, like chocolate donuts, that I'm not trying to bash on Bruce Clay. I think he's a great SEO and has got plenty to offer.
However, I remember distinctly his having to defend against almost every member of the SEW forums when he discussed the Linkmap system.
But Jason, it might be working because of the NOINDEX, FOLLOW tag he's got in there. Isn't the noindex tag helping him negate the recip penalty?
What do you think Rand?
Rumblepup,
The addition of the "noindex" tag on Bruce's LinkMap pages is a new one and negates the whole "original" purpose of the LinkMap.
I'm assuming that Bruce placed the "noindex" on his LinkMaps once he discovered the damage they were causing to his rankings.
The question is: is he continuing to sell the product to customers, and if so, do their LinkMaps also have the "noindex" tag on them? That would make no sense, considering the only purpose of LinkMaps (helping search engines find incoming links) is completely negated if there is a "noindex" tag on them.
Let me add one additional thing.
It's obvious Bruce didn't intend his product to have the nasty side effect of turning all of those "pesky natural one-way links" into "quality reciprocal" links, but unfortunately that's what it does.
I think it's time for a product recall, I cringe to think how many people's rankings are suffering because of this major side-effect.
I agree with your argument, I just had to double check.
Nice find, Jarrod. Whatever Bruce is doing, Google currently likes it. Bruce is an SEO after all, right?
(ok enough posting in here today, sorry if I am bothering anyone)
Whoah. That is an eye opener. I couldn't be at SES, but I watched the WPN videos. I know, sorry excuse for an SEO student.
So, is this why those noindex pages carry PR? Pretty good PR too, from what I could see.
The theory behind LinkMaps is plausible, if that "noindex" principle actually held. The problem is that it doesn't. If not carefully maintained, the tool can be misused and break down. When dealing with thousands of links (especially custom-built links), the tool defeats its own purpose in the end.
The LinkMaps tool is predicated on the notion that you can link out to people without actually "validating" their sites. Google will then all of a sudden increase your # of backlinks because they are being informed of some that were previously unrecognized. In other words, you can link out to your "inlinks" without actually endorsing their sites.
The problem is that the Search Engines will count that as an "endorsement" of the low-quality sites, even with the "noindex" tag.
If a bad site ends up on your LinkMaps pages, then your site will be linking to those bad neighborhoods and thereby be associated with those link farms. That's a first-class ticket to DelistedVille (or is it DelistedBurg? Either way, I'm sure it sucks there).
The engines themselves (in the Q&A on links at SES Chicago) said they are always going to follow the links on a "noindex" page anyway. If you use LinkMaps to let the engines know of everyone that links to your site, you must be absolutely sure that you're only including the higher-quality, legit sites. As I mentioned at the beginning, when dealing with thousands and thousands of backlinks, that's an incredibly manual process that defeats the purpose of using the tool. It would be safer and more intelligent to either submit those sites linking to you, or just allow the engines to find out about them on their own.
Sorry for the length, but I feel really strongly about this particular issue.
***
Now on the subject of the particular page ranking so highly for "search engine optimization", I do have to give credit where it's due: BC has been great at crafting content to adhere to the semantic theme of keyphrases.
If you look at the page, they systematically knock out a ton of relevant subject matter with highly relevant yet highly usable & readable language. Regardless of philosophical and theoretical differences, one can't really argue with precisely executed content.
One of BC's strengths has always been the development and creation of extremely keyphrase-relevant content that does the job of marketing at the same time.
Combined with age and some backlinks--they have about 100x the *page* content of everyone else on that SERP. I bet if Jill Whalen added an equal amount of relevant content to that HighRankings homepage she'd jump a couple of spots as well...
Ah, that clears things up for me a bit as well.
I've gone through some sites that are using his linkmaps and see that most of them do have:
<meta name="ROBOTS" content="FOLLOW,NOINDEX">
It's good to see that he was at least trying to make the product work. Although, as you pointed out, Abhilash, even with the "noindex" Google is still going to count the outbound links, which defeats the whole purpose of trying to hide the pages and links with the "noindex" tag.
Interesting.
At SES Chicago, a few speakers noted that nofollow means the link won't be used for assessing the relevancy of your site, but the link will be followed for purposes of discovery. It's not a vote from your site, but it's still an opportunity for a spider to find a new page.
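That distinction - noindex controls whether the page enters the index, while follow/nofollow controls whether its links are used - can be illustrated with a tiny parser. This is a hypothetical sketch of the directive logic, not any engine's actual implementation; real crawlers handle more directives and edge cases:

```python
def parse_robots_meta(content):
    """Interpret a robots meta 'content' value into (index, follow) flags.

    Directives are comma-separated and case-insensitive; a directive
    that is absent defaults to permissive (index, follow).
    """
    tokens = {token.strip().lower() for token in content.split(",")}
    may_index = "noindex" not in tokens
    may_follow = "nofollow" not in tokens
    return may_index, may_follow

# Bruce's LinkMap tag: page stays out of the index, links still followed
print(parse_robots_meta("FOLLOW, NOINDEX"))  # (False, True)
# The client-site tag: fully permissive
print(parse_robots_meta("index,follow"))     # (True, True)
```

Note how "FOLLOW, NOINDEX" keeps the page itself out of the results while leaving the door open for spiders to discover (and potentially count) the outbound links - which is exactly the loophole the comments above are worried about.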
sorry, a bit off topic, but the top sponsored listing "Really Bad SEO" cracks me up. who would actually use that as the title? and who's going to click it? i'd like some really bad seo, please.
It clearly has the highest CTR, because I doubt they're paying through the nose. It doesn't really surprise me - in an arena like SEO, the ad that stands out the most is going to get the clicks, even if it's unorthodox (or, rather, precisely because it is unorthodox).
spot on I'd say...
good ole marketing... sometimes the best way to get attention is to use something that you don't expect to see, coming from the opposite end of the spectrum, to cut through the ad-blindness
Okay, so they have a high CTR, but who's clicking? Potential clients or curious SEOs? So the question is, what is their conversion rate?
I guess I'm just wondering if they are really targeting that message or just doing some sort of test.
I just can't see any company seriously looking for SEO services to click on an ad that says: "Really Bad SEO".
Am I missing something here? Should I start marketing my clients sites and my own as being "really bad"?
They're playing into the fear motivator.
People looking for SEO are seeing all kinds of claims about getting top rankings, #1 on Google in a day, etc.
So they start feeling skeptical, but a bit enticed... then they see an ad that says "lookout, there's a lot of bad stuff out there, talk to us first before you make a move." In that way, they aren't even directly selling services to anyone, more approaching the notion of a confidant, looking out for your interests. So you might be prompted to at least take a look, or you might think they have a list of bad SEO firms on their site, so it might be worth seeing who's on the list.
Is it working? Who knows. But it seems like any time you can differentiate your message amongst all the others, it has to score you some points.
Actually Kimber, it's a very good ad.
The headline catches attention and the description makes you click on the ad. Notice that what they are doing is they are trying to become your "friend" .... helping you in the world of "enemies".
In my opinion ... it's a pretty good ad.
Rand, they wonder about you too.
https://www.bruceclay.com/blog/archives/2006/1...
Ahh the incestual world of SEO :)
Funny how that is. ;)
However, you mention as a concern that the page will start to lose new, natural links. My guess as well. But doesn't the age of a page, or site, slowly but continually add weight?
I thought that Google weighted history as well.
I usually look for the same things you do in investigating a site's ability. Primarily, I look for on page factors and then quality of the IBL's.
To me, a good IBL is one of two things: either PR and the % of links on the page, or subject relevancy, especially when the link is contextual. I kinda dig those two kinds of links.
Well, I would dig a text link from the front page of The New York Times as well, but that's just me.
Ah, yes, those were the good old days. Back when it was still possible to outrank Wikipedia.