We've been getting a lot of questions, both in Q+A and on the road at events like last week's Miva Merchant conference, Online Marketing Summit and the Y Combinator conference, about how to properly paginate results for search engines. In this post, we'll cover the dangers, opportunities and optimization tactics that can best ensure success. The best part? These practices aren't just good for SEO; they're great for usability and user experience, too!

Why is Pagination an SEO Issue?

Pagination, the practice of splitting lists of links to content across multiple pages, affects two critical elements of search engine accessibility.

  • Crawl Depth: Best practices demand that the search engine spiders reach content-rich pages in as few "clicks" as possible (turns out, users like this, too). This also impacts calculations like Google's PageRank (or Bing's StaticRank), which determine the raw popularity of a URL and are an element of the overall algorithmic ranking system.
  • Duplicate Content: Search engines take duplication very seriously and attempt to show only a single URL that contains any given piece of content. When pagination is implemented improperly, it can cause duplicate content problems, both for individual articles and the landing pages that allow browsing access to them.

When is Pagination Necessary?

When a site grows beyond a few dozen pages of content in a specific category or subcategory, listing all of the links on a single page of results can make for unwieldy, hard-to-use pages that seem to scroll indefinitely (and can cause long load times as well).

Tiny scroll icon on Facebook
Clearly, I need to log into Facebook more often...

But usability isn't the only reason pagination exists. For many years, Google has recommended that pages contain no more than 100 links (internal or external) in order to make it easy for spiders to reach deep into a site's architecture. Many SEOs have found that this "limit" isn't hard and fast, but staying within that general range remains a best practice. Hence, pages that contain many hundreds or thousands of links may inadvertently be hurting search engines' access to the content-rich pages in the list, making pagination essential.

Numbers of Links & Pages

We know that sometimes pagination is essential - one page of results just doesn't cut it in every situation. But just how many links to content should the average category/results page show? And how many pages of results should display in the pagination?

Pagination 1

There are a lot of options here, but there's serious danger in using the wrong structures. Let's take a look at the right (and wrong) ways to determine link numbers.

Pagination 2

Pagination 3

Pagination 4

In some cases, there are simply too many pages of results to list them all. When this happens, the very best thing you can do is to work around the problem by... creating more subcategories! It may seem challenging or even counter-intuitive, but adding either an extra layer of classification or a greater number of subcategories can have a dramatically positive impact on both SEO and usability.
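
As a quick illustration of how this might look in practice (the category names, URLs and listing counts below are purely hypothetical), adding a layer of subcategories turns one enormous paginated list into several much shorter ones:

    <!-- Hypothetical sketch: rather than paginating one huge "Theatres" category,
         link out to subcategories so each results list needs far fewer pages -->
    <ul class="subcategories">
      <li><a href="/theatres/princeton-nj/playhouses/">Playhouses (42)</a></li>
      <li><a href="/theatres/princeton-nj/dinner-theatre/">Dinner Theatre (18)</a></li>
      <li><a href="/theatres/princeton-nj/outdoor-venues/">Outdoor Venues (27)</a></li>
      <li><a href="/theatres/princeton-nj/concert-halls/">Concert Halls (35)</a></li>
    </ul>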

Pagination 5

Pagination 6

There are times, however, when even the creation of many deep subcategories isn't enough. If your site is big enough, you may need to have extensive pagination such that not every page of results can be reached in one click. In these cases, there are a few clear dos and don'ts.

Do:

  • Try to link to as many pages of the pagination structure as possible without breaking the 100(ish) links per page limit
  • Show newer content at the top of the results list when possible, as this means the most link juice will flow to newer articles that need it (and are temporally relevant)
  • Use and link to relevant/related categories & subcategories to help keep link juice flowing throughout the site
  • Link back to the top results from each of the paginated URLs

Pagination 7

Don't:

  • Show only a few surrounding paginated links from paginated URLs - you want the engines to be able to crawl deeper from inside the structure
  • Link to only the pages at the front and end of the paginated listings; this will flow all the juice to the start and end of results, ignoring the middle
  • Try to randomize the paginated results shown in an effort to distribute link juice; you want a static site architecture the engines can crawl
  • Try to use AJAX to get deeper into the results sets - the engines sometimes follow small snippets of JavaScript, but they're not at a point where this is an SEO best practice
  • Go over the top trying to get every paginated result linked-to, as this can appear both spammy and unusably ugly

When in doubt, weigh the goals you're optimizing toward - keeping the number of extra pagination pages low, making the browsing experience usable (many webmasters mistakenly think users will simply give up and search, forgetting that some of us can't recall the name of the piece we're looking for!) and maintaining a reasonable count of links per page. Also note that although I've illustrated using 5-10 listings (for graphical space reasons), a normal listings set could be 30-90 links per page, depending on the situation.
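
To make the dos and don'ts above a bit more concrete, here's a minimal sketch of a crawlable pagination block in plain HTML. The URLs, class name and page count are hypothetical; the point is simply that the links are static anchor tags the spiders can follow, and that every paginated URL also links back to the top results page:

    <!-- Hypothetical pagination block: static <a href> links (no JavaScript),
         with a link back to the top results page from every paginated URL -->
    <div class="pagination">
      <a href="/theatres/princeton-nj/">&laquo; Back to all results (page 1)</a>
      <a href="/theatres/princeton-nj/page/2/">2</a>
      <a href="/theatres/princeton-nj/page/3/">3</a>
      <a href="/theatres/princeton-nj/page/4/">4</a>
      <a href="/theatres/princeton-nj/page/5/">5</a>
      <a href="/theatres/princeton-nj/page/6/">6</a>
      <a href="/theatres/princeton-nj/page/7/">7</a>
    </div>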

Titles & Meta Descriptions for Paginated Results

In most cases, the title and meta description of paginated results are copied from the top page. This isn't ideal, as it can potentially cause duplicate content issues. Instead, you can employ a number of tactics to help solve the problem.

Example of results page titles & descriptions:

Top Page Title: Theatres & Playhouses in Princeton, New Jersey
Top Page Meta Description: Listings of 368 theatres, playhouses and performance venues in the Princeton, NJ region (including surrounding cities).

Page 4 Title: Page 4 of 7 for Princeton, New Jersey Theatres & Playhouses
Page 4 Meta Description: Listings 166-220 (out of 368) theatres, playhouses and performance venues in the Princeton, NJ region (including surrounding cities).

Alternate Page 4 Title: Results Page 4/7 for Princeton, New Jersey Theatres & Playhouses
Alternate Page 4 Meta Description: (none)

Yes, you can use no meta description at all, and in fact, if I were setting up a CMS today, this is how I'd do it. A missing meta description reduces complexity and the potential for mis-casting URLs as duplicates. Also notice that I've made the titles on results pages sub-optimal to help dissuade the engines from sending traffic to these URLs rather than the top page (which is made to be the better "landing" experience for users).
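
As a rough sketch of how that "alternate" page 4 setup might render in the document head (the markup and URL structure here are illustrative, not pulled from any particular CMS):

    <!-- Page 4 of the paginated results: deliberately sub-optimal title,
         and no meta description tag at all -->
    <head>
      <title>Results Page 4/7 for Princeton, New Jersey Theatres &amp; Playhouses</title>
      <!-- no <meta name="description"> on paginated pages -->
    </head>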

Nofollows, Rel=Canonicals and Conditional Redirects

Some SEOs and website owners have, unfortunately, received or interpreted advice incorrectly about employing directives like the nofollow tag, canonical URL tag or even conditional redirects to help control bot activity in relation to pagination. These are almost always a bad idea.

Whatever you do, DO NOT:

  • Put a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL. You'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages).
  • Add nofollow to the paginated links on the results pages. This tells the engines not to flow link juice/votes/authority down into the results pages that desperately need those votes to help them get indexed and pass value to the deeper pages.
  • Create a conditional redirect so that when search engines request paginated results, they 301 redirect or meta refresh back to the top page of results.

The only time I recommend using any of these is when pagination exists in multiple formats. For example, if you let users re-sort by a number of different metrics (in a restaurant list, it might be by star rating, distance, name, price, etc.), you may want to either perform this re-sort using JavaScript (employing the hash (#) in the URL so no new crawlable URLs are created) or make those separately segmented paginated results rel=canonical back to a single sorting format.
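
For instance, using a hypothetical restaurant directory (the paths and markup below are assumptions for illustration, not taken from a real site), each re-sorted variant of the listings would carry a canonical URL tag pointing back to the default sort order:

    <!-- Sorted variants like /restaurants/?sort=rating or /restaurants/?sort=price
         show the same listings re-ordered, so each one points back to the default sort -->
    <head>
      <title>Restaurants in Princeton, NJ</title>
      <link rel="canonical" href="http://www.example.com/restaurants/" />
    </head>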

Letting Users Display More/Less Results

From a usability perspective, this can make good sense, allowing users with faster connections or a greater appetite for browsing to view large numbers of results at once. However, it can cause big duplicate content problems for search engines and add complexity and useless pages to the engines' indices. If/when you create these systems, employ JavaScript/AJAX (either with or without the hash tag) to make the pages reload without creating a separate URL.
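
As a hypothetical sketch of that approach (the markup and function below are illustrative only), a "rows per page" selector handled entirely client-side might look something like this; the URL path never changes, so the engines only ever index the default version of the page:

    <ul id="results">
      <li>Listing 1</li>
      <li>Listing 2</li>
      <!-- ...one <li> per listing, all present in the default HTML the engines crawl... -->
    </ul>
    <select onchange="showRows(parseInt(this.value, 10));">
      <option value="25" selected="selected">25 results per page</option>
      <option value="50">50 results per page</option>
      <option value="100">100 results per page</option>
    </select>
    <script type="text/javascript">
      // Show or hide listings client-side; no separate URL is created for the
      // engines to index. The choice can optionally be recorded after the hash (#).
      function showRows(count) {
        var items = document.getElementById('results').getElementsByTagName('li');
        for (var i = 0; i < items.length; i++) {
          items[i].style.display = (i < count) ? '' : 'none';
        }
        window.location.hash = 'rows=' + count;
      }
    </script>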

Number of Rows Choices
(the Google Analytics interface allows users to choose the number of rows shown, though they don't have to worry much about crawlability or search-friendliness)

Also remember that the "default" number of results shown is what the search engines will see, so make that count match your goals for usability and SEO.

If you have any thoughts or recommendations to share in the comments, we'd love to hear from you!