I'm currently in the process of re-authoring and re-building the Beginner's Guide to Search Engine Optimization, section by section. You can read more about this project here.


Redirecting Pages for Users & Search Engines

On the web, as in life, the only constant is change. When things need to move, in our case from one URL to another, there are critical best practices to observe.

Let's first assume a simple scenario - a URL that needs to point permanently to another address.

Illustration of a Redirect

There are multiple options for accomplishing this feat, but in general, a single one, the 301 redirect, is preferable for both users and search engines. Serving a 301 indicates to both browsers and bots that the page has moved permanently. Search engines interpret this to mean not only that the page has changed location, but that the content, or an updated version of it, can be found at the new URL. The engines will carry any link weighting from the original page over to the new URL, as below:

Googlebot successfully follows a 301 redirect
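
If you're curious what a 301 actually looks like at the HTTP level, here's a minimal sketch using Python's standard library. The /old-page path and example.com target are just placeholders, and on a real site you'd typically configure redirects in your web server rather than in application code:

```python
# Minimal sketch: answering a request with a 301, standard library only.
# The /old-page path and example.com target are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # 301 means "moved permanently"; the Location header
            # carries the new address for browsers and bots alike.
            self.send_response(301)
            self.send_header("Location", "http://www.example.com/new-page")
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```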

Be aware that when moving a page from one URL to another, the search engines will take some time to discover the 301, recognize it, and credit the new page with the rankings and weight of its predecessor. This process can be lengthier if your page hasn't changed in a long time and the spiders rarely visit it, or if the new URL doesn't properly resolve.

Other options for redirection, like 302s (temporary redirects), meta refreshes, or JavaScript, are poor substitutes, as they generally will not pass rankings and search engine value the way a 301 does.
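
Since the status code is the whole difference between a permanent and a temporary redirect, it's worth verifying which one your server actually sends. Here's a quick sketch using Python's standard library (the URL is a placeholder):

```python
# Quick check of which status code a redirect returns.
# The URL below is a placeholder.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Returning None tells urllib not to follow the redirect,
        # so we can inspect the 3xx response itself.
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    response = opener.open("http://www.example.com/old-page")
    print(response.status)  # no redirect happened at all
except urllib.error.HTTPError as err:
    # urllib surfaces the unfollowed 3xx response as an HTTPError.
    print(err.code, err.headers.get("Location"))  # e.g. 301 vs. 302
```

(Running curl -I against the old URL from a command line will show you the same status line and Location header.)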

Transferring content becomes more complex when an entire site changes its domain or when content moves from one domain to another. Due to abuse by spammers and suspicion by the search engines, 301s between domains sometimes require more time to be properly spidered and counted. For more on moving sites, see Expectations and Best Practices for Moving to or Launching a New Domain.
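
As a rough illustration of the mechanics of a whole-site move, the sketch below answers every path on the old domain with a 301 to the same path on the new one. The domain name is hypothetical, and in practice you'd almost always implement this with a rewrite rule in your server configuration rather than an application like this:

```python
# Sketch of a whole-site move: every request to the old domain
# gets a 301 to the same path on the new domain.
# "newdomain.example" is hypothetical.
from wsgiref.simple_server import make_server

def redirect_app(environ, start_response):
    # Preserve the requested path (and query string) on the new domain.
    new_url = "http://newdomain.example" + environ.get("PATH_INFO", "/")
    if environ.get("QUERY_STRING"):
        new_url += "?" + environ["QUERY_STRING"]
    start_response("301 Moved Permanently", [("Location", new_url)])
    return [b""]

if __name__ == "__main__":
    make_server("", 8000, redirect_app).serve_forever()
```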

Server & Hosting Issues

There are, thankfully, few server or web hosting dilemmas that affect the practice of search engine optimization. However, when overlooked, they can spiral into massive problems, and so are worthy of our review. The following are server and hosting issues that can negatively impact search engine rankings:

  • Server Timeouts - If a search engine makes a page request that isn't served within the bot's time limit (or that produces a server timeout response), your pages may not make it into the index at all, and will almost certainly rank very poorly (since no indexable text content has been found). A simple spot-check for this and the next issue is sketched after this list.
  • Slow Response Times - Although not as damaging as server timeouts, above, slow response times still present a potential issue. Not only are crawlers less likely to wait for your pages to load, but surfers and potential linkers may choose to visit and link to other resources because your site is a hassle to access.
  • Shared IP Addresses - Lisa Barone wrote an excellent post on the topic of shared IP addresses back in March of 2007. Basic concerns include speed, the potential for having spammy or untrusted neighbors sharing your IP address, and potential concerns about receiving the full benefit of links to your IP address (discussed in more detail here).
  • Blocked IP Addresses - As search engines crawl the web, they frequently find entire blocks of IP addresses filled with nothing but egregious web spam. Rather than blocking each individual site, engines do occasionally take the added measure of blocking an IP address or even an IP range. If you're concerned, search for your IP address at MSN/Live using the IP:address query (or SEOmoz's Who Else is Hosted on My IP Tool).
  • Bot Detection and Handling - Some SysAdmins go a bit overboard with protection and restrict access for any single visitor making more than a certain number of requests in a given time frame. This can be disastrous for search engine traffic, as it will constantly limit the spiders' ability to crawl the site.
  • Bandwidth & Transfer Limitations - Many servers have set limitations on the amount of traffic that can run through to the site. This can be potentially disastrous when content on your site becomes very popular and your host shuts off access. Not only are potential linkers prevented from seeing (and thus, linking to) your work, but search engines are also cut off from spidering.
  • Server Geography - This isn't necessarily a problem, but it's good to be aware that search engines do use the location of the web server when determining where a site's content is relevant from a local search perspective. Since local search is a major part of many sites' campaigns, and it's estimated that close to 40% of all queries have some local search intent, it's very wise to host in the country where your content is most relevant (it's not necessary to get more granular than that).
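
As promised above, here's a simple spot-check for timeouts and slow responses, again using only Python's standard library. The URL and the ten-second limit are arbitrary placeholders:

```python
# Spot-check for timeouts and slow responses, standard library only.
# The URL and the ten-second limit are arbitrary placeholders.
import time
import urllib.request

def check_response_time(url, timeout=10.0):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
        print(f"{url}: responded in {time.monotonic() - start:.2f}s")
    except OSError as err:
        # URLError (a subclass of OSError) covers timeouts, DNS
        # failures, and refused connections.
        print(f"{url}: failed ({err})")

check_response_time("http://www.example.com/")
```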

I'm actually really excited to have this section finished, as it means I can start diving into some less dry, more fun material with the next few chapters :)

BTW - If you haven't yet taken our survey, please do! And yes, expect to see me post this at the bottom of every blog post for the next few days to help encourage participation. We're hoping to get 3,000 or more responses, which would dwarf the sample size of something like the SEMPO report (which garnered 587 responses for 2006).