N.B. This post is intended for relatively new SEOs with smaller sites. Intermediate and advanced SEOs probably already know most of this information, and I do not claim to have experience with sites numbering thousands of pages, for which different strategies are surely more effective.
Some of you may know that I run an NCAA basketball website in my spare time from November through April. Until about three weeks ago, I ran the site entirely on static, hand-coded HTML pages. (Needless to say, a conversion to a CMS was long overdue.)
This year we posted Team Profiles for over 40 teams that made the NCAA Tournament field, and it was impossible for me to keep up with linking to them from everywhere they should have appeared: the author's bio page, the sitemap, etc. So I wound up with 40+ pages that were linked ONLY from the homepage, which at the time had a PageRank of 4.
Screenshot of homepage ~ 04/01/07. Links to Team Profile pages are denoted by light gray backgrounds. Google, in case you’re wondering, that’s an EDITORIAL link to TicketCity!
Over the summer, I was L-A-Z-Y and never cross-linked those new pages throughout the site as I should have.
But lo and behold, when I went to 301-redirect those pages to their new WordPress counterparts a few weeks ago, I noticed that a significant percentage (60%?) of these 40 pages had accrued a PR of 4, the same rank as the homepage. The remaining profiles (40%?) had accrued a PageRank of 3. These were pages with NO inbound links other than the one from the Bracketography homepage.
I am well aware that PageRank ≠ SERP Rank, but this unexpected phenomenon has led me to the following strategy.
Best Practice Site Architecture for Small Websites, v2.0
You hear it all the time at introductory SEO sessions: "Keep your site architecture flat, so the spiders don't have to crawl through multiple levels of links to get to a particular page." Well, folks, I'm recommending in this post that you make your architecture flat as a pancake. True, it's not as semantically valid, nor as good for usability, but those homepage links are INCREDIBLY important for SEO purposes.
For the average small website, almost all of your link juice / PageRank will naturally accrue first to your homepage, from which it is filtered to your sub-pages via your site's hierarchy. (The exception is for sites with a lot of linkbait-y content, where there are a ton of links pointing directly at that content.)
See the diagrams below for the modification of traditional SEO strategy that I am suggesting. This flat-as-a-pancake strategy should be even MORE ammunition to use against designers who want to incorporate a splash page.
As I stated at the beginning of this post, this modification only holds true for smaller sites. Larger sites would confuse visitors and run out of screen real estate if they were to use this strategy.
Why I Recommend This for Smaller Sites: A Theory of Homepage Link Value
Most experts in the field feel that search engines typically use an algorithm along the lines of
.85 * (1/x)
in determining the value of a link, where x is the number of links on a page (i.e., the more links you have on a page, the less juice each link will pass). See Sergey & Larry’s famed original paper on PageRank.
The 85% at the front of the expression is what’s known as a damping factor, to ensure that the amount of link juice passed from truly unimportant pages on the web eventually approaches zero. And of course, Google has stated explicitly that its bot typically doesn’t follow more than 100 links on a page.
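The damping-factor idea above can be made concrete with a tiny sketch. This is a minimal, illustrative power-iteration version of the formula from the Brin & Page paper, PR(A) = (1 − d) + d × Σ PR(T)/C(T); the link graph below is a made-up example of a homepage with three subpages, not any real site.

```python
DAMPING = 0.85  # the .85 from the formula above

# Hypothetical link graph: homepage links to three subpages,
# each subpage links back to the homepage.
links = {
    "home":   ["page_a", "page_b", "page_c"],
    "page_a": ["home"],
    "page_b": ["home"],
    "page_c": ["home"],
}

def pagerank(links, d=DAMPING, iterations=50):
    """Iterate PR(A) = (1 - d) + d * sum(PR(T) / C(T)) until it settles."""
    pr = {page: 1.0 for page in links}
    for _ in range(iterations):
        new_pr = {}
        for page in links:
            # Juice flowing in: each linking page's rank, split
            # evenly across that page's outbound links.
            inbound = sum(
                pr[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

ranks = pagerank(links)
```

Run it and you'll see the homepage settle at a noticeably higher score than the subpages, since each subpage receives only a third of the homepage's juice, which is exactly the splitting behavior the 1/x term describes.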
But in my experience, if one is talking about a homepage link, the algorithm might look something closer to
.95 * (3/x)
Notably, I think:
- The damping is weaker (the multiplier itself is higher, .95 vs. .85, so less juice is lost to damping)
- The number of links on the page does not affect the value of an individual link as much (Google knows that homepages by their nature have to get people to different parts of the site quickly)
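To see what this guess implies in practice, here's a quick comparison of per-link value under the standard formula versus my hypothesized homepage formula. To be clear, both functions encode assumptions from this post, not anything Google has published:

```python
def standard_link_value(x):
    """Per-link juice under the commonly cited .85 * (1/x) model."""
    return 0.85 * (1 / x)

def homepage_link_value(x):
    """Per-link juice under this post's guessed homepage model, .95 * (3/x)."""
    return 0.95 * (3 / x)

for x in (10, 50, 100):
    print(f"{x:>3} links: interior {standard_link_value(x):.4f}"
          f"  vs  homepage {homepage_link_value(x):.4f}")
```

Under these assumptions a homepage link would pass roughly 3.35x the juice of an interior-page link at any link count, which would explain why 40+ profile pages linked only from a PR4 homepage could themselves reach PR3 or PR4.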
Yes, this strategy has the potential to devalue your homepage slightly, if you include SO many links that your juice bleeds away too rapidly. But as long as your individual pages are well-optimized with good titles and headers, they should have a better chance than your homepage at showing up in the SERPs anyway, even with a slightly lower PageRank.
Has anyone else seen this kind of strategy either succeed or fail on their own site(s)?
Bonus Tip: If you're new to SEO, you may not know that another way to funnel your link juice to the pages you'd like to rank well is to use "rel=nofollow" on links to pages you have no desire to rank (like your privacy policy, legal disclaimers, etc.). In an interview with Rand a while back, Matt Cutts said this is perfectly legitimate – see answer #2.
David Mihm is a small business SEO + website designer based on the U.S. West Coast.
David,
great job and nice graphics.
I'd have to concur with Rand. Site architecture is something that many people blow right on by without giving it a second thought... immediately focusing on title tags, h1's, anchor text, and all the other usual suspects... but if you don't start with a good foundation, you'll never reach potential.
But this is even more critical because that site you start out today with 20 or 50 pages, will probably grow to 100, and on up from there. If you can build your "eventual" site structure into your initial one, you'll be way ahead in the future.
Similar to Randall's mention of siloing, as your site grows you can take main pages and build them out into strong thematic category pages... essentially sub-homepages that can be built up to be strong enough to attract IBLs of their own.
Many approaches to that, but one is to find a way to incorporate RSS into each of these silo heads... news, blog, products. Then focus on syndicating these feeds with a link to the main page as well as the individual pages. Then focus on freshness and quality of what you are feeding into the feeds and you've created scalability.
Identity,
Great point about keeping scalability in mind. I think it's important to have this in the back of your mind when you're starting any site. If you can come up with a flat structure at the beginning that allows for a more siloed version down the road, that's the best of both worlds.
Another point about homepage links is that even on larger sites where silo-ing is essential, you can use them to funnel juice to particular "product" pages that may be of temporal importance, like for the holidays...
(A site may not sell products, but maybe there's a blog post you wrote last year buried three levels down in a silo that is again an interesting read because of a recent piece of news, etc.)
Yes, this is true from my own experience too. I know most of us don't monitor supplementary pages anymore, but I also found that pages linked to from my homepage didn't enter the supplementary index; similar pages, if not linked to from my homepage, did enter the SI.
My strategy, rather than be quite as "flat" as you advocate, has been to work on getting links to second- and third-tier pages. The effects on PR may not be quite as pronounced, but it does get you more targeted traffic to your interior pages as well. I also interlink pages at my site as much as possible, to spread the PR around.
While they might have usability problems, dropdown menus would probably be useful with the sort of site architecture you're talking about because you can have all the links from the home page (and every other page), but not have to worry about using up so much screen real estate.
Except in browsers with JavaScript off and robots that don't understand CSS (if that's the type of dropdowns used). Sure, you'll include all those links between <noscript></noscript> tags, but you'll still have all those links on one page, diluting the juice... I dunno, I'd stick with David's recommendation; this would work mostly for small sites.
Interesting - I'm just busy redoing my internal linking structure. I have been flattening it (just learnt that's how we term it ;) ). I have 4 primary subject tiers (hope it makes sense):
Level 0: > Home
Level 1: >> Subject 'A' (5 more or less static articles on this level)
Level 2: >>> Sub-Subjects of 'A' (Blog Style) (Partially also Level 1 as home page will have 4 featured posts - One from each subject tier.)
-------------------------
Level 1: >> Subject 'B'
Level 2: >>> Sub-Subjects of 'B'
etc...
Still busy implementing it, but I have a little more confidence that it will be successful after reading this. Thank you.
Interesting post David. I tried some experiments in the past creating a flatter architecture. I didn't link to deep pages from my home page, but to my blog's home page which probably has an equal amount of incoming links as the home page.
I did notice pages I linked to directly from the blog's home page received a boost in ranking not long after.
Couldn't agree more. It's been my experience over the years that this method holds true. I have had quite a few pages suddenly rank for the terms needed, within days and weeks in Google. Just from the flat design and links off the home page.
Interesting.
My own experience with a very small, totally flat site just recently achieving PR for the first time - guess when :) - is quite different.
Although the site in question has a visual hierarchy (5 top level items, but with a "most recent articles" sidebar which appears on every single page and actually includes all articles on the site at present) the pages that are visually designated as the top level have PR3, whereas the pages in the sidebar have yet to achieve PR. This despite 4-5 of those articles receiving most of the inbound link love.
I suspect the PR algo isn't sophisticated enough to differentiate between the sidebar and top nav links (in terms of the markup they have pretty much equal weight...I'm confident a robot would have a hard time guessing which is the main nav and which is a sub nav).
However, there is a page on my site that lists all the articles in the sidebar along with a brief description, in addition to the link title used across other pages. At the moment my conclusion is that this is what's causing the weird PR distribution, but I intend to do some tweaking and find out.
Anyhow, thanks for the article! I'm currently building up my playbook for better PR shaping and this is highly relevant.
This is a great outline for smaller sites. It's funny how some people want to re-invent the wheel and make things harder than they need to be. Keep things as simple as possible and it will make your job a lot easier.
Nice work David!! Fantastic post. So who are you picking to win this year?
Thanks, iSearch. I think it's going to come down to UNC vs Kansas; as much as it pains me to say it, I'm predicting Bill Self gets his first title this year.
But a LOT can happen between now and April 7! :)
I had a similar experience with 14thC. At first it was static and then I kept adding content and keeping up the link management... well... didn't really happen. So I redesigned with includes and made category pages that turned into sections.
When that happened I siloed those sections. Sure, the top page gets a link from the primary nav, but the subpages have their own nav structure as well. It seems to help with relevancy and distribution of PR. I just changed the structure of my site to be entirely in WP for ease of management, right before the last PR update; before it, most of the pages on the site had a PR4, with a couple of PR2s and PR3s thrown in. The blog posts weren't faring as well, though, for lots of reasons.
Thanks, David.
I had already implemented this strategy, somewhat, on my site.
I also realized, going through my mess, that one of my "dead" pages was receiving a lot of juice. Fixed that in a hurry.
Rand linked to an article awhile back, that stated you should redesign your site at least twice a year. Had I done that a little earlier, I would probably rank higher by now.
Some have said that PR is just a number and does not affect rankings that much...Is that true or untrue, in your experience?
Come March Madness, I will be hitting your site on a daily basis.
Go DUKE!
If not a full on re-design, at least a re-assessment of your most lucrative opportunities for rankings and traffic.
PR does not NECESSARILY affect rankings, in my experience (i.e. I've seen PR 3 pages outrank PR 6+ pages because of better targeting and better incoming anchor text), but I do think it remains a good heuristic for how important Google thinks a particular page is.
I think of it as kind of a "potential" for ranking more than anything else. If you have a PR 1 or 2, the likelihood you're going to rank for a competitive phrase, even with amazing on-page SEO, is pretty low. But if you have a PR 5 or 6, there's plenty of potential to rank if you play your cards right.
I have several sites that rank well for pretty competitive terms, with only a 1 or 2 PR (as of the recent change, anyway - was a 4 previously). Not sure why they dropped PR, but they still rank at the top of the SERPs, so I really don't care much about PR.
It's not so much that, as whether Google will give it an extra push in a competitive category. I was wondering whether, all things being equal, it would serve as tie-breaker of sorts.
That's where I get conflicting information.
I'm in the top 10 on several of my KW, and I'm making the push for No. 1. I want to do whatever white hat things I can to get there.
Bigbuy, if that's the case, I wouldn't be so concerned about PageRank. I'd be more interested in getting some inbound links with the exact anchor text of the keywords you're trying to rank for...
Got it...I'm just trying to squeeze the most out of every ranking factor I can find...and I figured that since you had plenty of experience in the industry...you would know if that could possibly be a factor...
Thanks for the answers...
We're competing with people who have the KW as their domain name...rough seas my friend...
any experience in that category?
Rand linked to an article awhile back, that stated you should redesign your site at least twice a year.
Does anyone know where that article is?
It's called "20 Hard Core SEO Tips".
Parts of it were controversial... I tried searching Rand's posts and my comments for a while, and then checked to see if I had bookmarked it.
I did.
click here for post
Thanks!
I use this thing called a "sitemap". It seems to really help get those deep, hard-to-index pages into the result rankings :) Nice article, BTW. I have been working on a new content-heavy project and have been examining our site structure to make it as simple as possible so I can scale it to the roof.
-Dal
Great article David and props for getting it on the main blog!! This article came just in time considering I am developing my small site as we speak.
Nice article David. For a very small site, I could see where this approach could be both manageable and effective.
Thanks, Sean.
As I'm reading it "in print" for the first time, I need to add that I'm by no means suggesting subpages should ONLY be linked from the homepage. They should also be cross-linked from other subpages, of course. I'm just trying to use this example to point out that homepage links are really, really important.
I'm just a few weeks pre launch, so this article came at a perfect time for me. Thanks!
This is an excellent post, David, and I think it's really geared for a more advanced audience than you might think. Beginner SEOs need to know what title tags are and how to make good URLs :)