This morning I woke up at 3:45am (just about the time I usually go to sleep), kissed my fiancée goodbye, and jetted off for a day-long meeting in Phoenix, AZ with the folks at Village Voice Media (operators of 16 weekly newspapers and many media-focused sites and blogs). During our chat, the issue of pagination for article content reared its ugly head, and since this particular issue troubles so many CPM-revenue focused sites, I decided it needed to be addressed (despite the fact that I touched down in Seattle at 8pm, taxied over to Vanessa's for Buffy Tuesday, and desperately need some sleep).
Here's the fundamental dilemma:
- CPM Revenue is based on page views (because more page views mean more ad views).
- Splitting up lengthy news articles (or any lengthy content, for that matter) from a single long page into more digestible-sized pages is not only good for user experience (depending on your users' preferences), but also promotes more page views, and thus more ad revenue.
- From an SEO perspective, splitting up articles into multiple pages is generally bad, because:
- Most links (internal and external) will only point to the first page of the article, so subsequent pages get comparatively little link juice
- Creating unique title tags (or meta descriptions) for multiple article pages presents a significant problem (what else are you going to call it, but "article title page 2 of 12"?)
- If the entire content were on a single page, you might generate far more mid and long tail search traffic to the page, because those keywords would all be on a single, more competitive URL
- What's a publisher to do?
For those who might not be following, I've drawn up some handy images to help illustrate the problem:
There is a clever solution to this issue, but it veers into territory that some worry is gray hat - using CSS layers and JavaScript to keep all of the content on a single page, but make clicks to view the layers that would have been page 2, page 3, etc. refresh the ads and "count" as a page view.
In this example, the search engines only get a single URL with all of the content, and the visitors get what could certainly be argued to be a more usable page, while the publisher gets what they want - the maximum number of "page" views. One large media website that used to use exactly this tactic was the International Herald Tribune. However, it appears that now they too are creating multiple URLs for their paginated material, and seeing those results in Google (example). Quite the pickle...
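For anyone who wants to see the moving parts, here's a minimal sketch of that layer-switching approach. The class names are made up for illustration, and refreshAds() is just a placeholder stub - it is not any particular ad network's API.

```javascript
// Minimal sketch of the layer-switching idea, assuming the full article sits in
// one HTML document split into <div class="article-page"> blocks, with the
// "1 | 2 | 3" links carrying class="page-link". refreshAds() is a placeholder
// stub, not any real ad network's API.
function refreshAds() {
  // In a real setup this would call the ad server's own reload routine.
}

function showArticlePage(pageIndex) {
  var pages = document.getElementsByClassName('article-page');
  for (var i = 0; i < pages.length; i++) {
    pages[i].style.display = (i === pageIndex) ? 'block' : 'none';
  }
  refreshAds();          // serve fresh ad creatives for the new "page"
  window.scrollTo(0, 0); // mimic the feel of landing on a new page
}

// Wire the pagination links up so they swap layers instead of loading new URLs.
var pageLinks = document.getElementsByClassName('page-link');
for (var j = 0; j < pageLinks.length; j++) {
  (function (index) {
    pageLinks[index].onclick = function () {
      showArticlePage(index);
      return false; // keep the browser from following the href
    };
  })(j);
}
```

The engines crawl the single URL with everything on it, while human clicks do the ad refreshing.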
I really have to come down on the side of usability on this one. I personally can't stand having to click the little "1, 2, 3" page numbers or the "next page" link beneath an article. Even if I'm rather enjoying the read, it's enough of a disruption that half of the time I'm gone. Obviously my single opinion doesn't constitute a usability study, but I'm going with my gut on this one: users don't like clicking for another page to continue reading.
I'd have to recommend using a single page wherever possible. Sure, with additional pages or JS-driven CSS layers you can go after more revenue, but Pat's point above about this being slightly spammy is right... in addition to the fact that you're trading usability for a bit more ad revenue. In the long term, I don't know that that's going to be to your benefit.
In any case, great post with very helpful illustrations (as usual). Thumbs up.
There's a mistake in your logic that a longer, content-rich article stands a better chance of getting more search engine traffic. An article split over three pages with proper SEO in place (make them write their articles with three main headings, then use those to create the meta title/h1 on each new page) stands a far better chance of picking up long tail terms, especially in news, where freshness will play a big factor. Not only that, but unless they are one of the top 5 news sites in the world, going after the long tail is a much more sustainable plan in the long run.
Totally white hat, generates as many, if not more, search engine referrals and gives them the CPM they need. You might want to do something about PR leak over time though... i.e. make sure links to deeper pages only come from the first page.
Million dollar tip; you get it for free :)
P.S. From a usability point of view, all the extra page views blow. But we all know that Google's old mantra of putting users first isn't even followed by Google itself, so unless they are concerned about being a really cool, hip company that puts their users above everything else, this is probably going to be a much lower priority :)
After reading through all the comments, I have to say, I'll go with SamIm's way of titling the split pages with their own unique subtitles on my article.
Rand thanks for this post. Just what I needed.
Sam - I respectfully disagree with your opinion that splitting the pages gets more long tail search traffic. We've tried it both ways on two different client sites (actually, once it wasn't us, but the client had the testing data before we started working) and in each case, all the content on one page meant more referrals from engines. It makes sense logically, too - consolidation of link juice / PageRank / trust / authority / etc is almost always better than dilution.
One Edit - Sam, your scenario could be dead on if the content were split into unique sections that functioned almost like individual articles, where people would read them, use them and link to them independently of the larger piece. I think that might work well for "how-to" articles, but probably not well at all for "news" items.
Is there anyone in the world who likes having to read one article on 3 pages? One of the most annoying things about the web is the arbitrary splitting of a single article into multiple pages, when it is obvious the only reason to split is to inflate page count.
A few suggestions:
#1) On all pages subsequent to the first, nofollow EVERYTHING but the link directing to page #1 (and any links in the content, naturally). Nofollow the contact page, the link to page #3, the sports section, etc etc. This turns pages 2+ into linkjuice funnels which siphon the juice directly to the best optimized page for it. Granted, it isn't as good as page #1 being the preferred choice for all linkers, but what can you do?
#2) Oh, have your "Mail a friend a link" / "Permalink to this" tools return page 1. That will help you significantly, and it is the best result to your users (most of whom conceive of articles as articles rather than web pages).
#3) Either have your editors write up N subheads for an N-page article, and make the titles and descriptions match, or algorithmically discover the subheads by examining the article yourself, and write them that way. "Obama Sweeps Georgia | Super Tuesday Results | Patio11 News Service" makes a quite decent title tag for page 3 (rough sketch after this list).
#4) Supplement pagination, which I think is a terrible user metaphor, with a human-readable table of contents. This gets your section headings all onto page 1, to help it rank for the terms in them, drives juice to pages 2+ for the proper terms, and is a win for usability as well.
#5) Teach your editors to write subheads with SEO in mind, too.
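For what it's worth, here's a rough sketch of what suggestion #3 could look like in code; the article object, its subheads, and the site name are invented purely for illustration, not anyone's real data or CMS API.

```javascript
// Rough sketch of suggestion #3: build each sub-page's <title> from that page's
// editorial subhead instead of "Article Title, page N of M". The article
// structure and values below are made up for illustration.
function buildPageTitle(article, pageIndex, siteName) {
  var subhead = article.sections[pageIndex].subhead;
  return subhead + ' | ' + article.headline + ' | ' + siteName;
}

var article = {
  headline: 'Super Tuesday Results',
  sections: [
    { subhead: 'Clinton Takes California' },
    { subhead: 'McCain Pulls Ahead' },
    { subhead: 'Obama Sweeps Georgia' }
  ]
};

// Page 3 (index 2) gets the title from the example above:
buildPageTitle(article, 2, 'Patio11 News Service');
// -> "Obama Sweeps Georgia | Super Tuesday Results | Patio11 News Service"
```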
I am glad to see this topic get covered.
I have moved several articles back and forth between one page and multipage formats. I have done lots of testing and my results say that Rand is right, the long articles get more total search traffic and they often rank higher for their main terms. I disagreed with Rand on this exact topic in an earlier post and I am eating a small crow right now :D
Average visitor time on site (of those who arrived through search on the first page of the article) did not change very much - which suggests to me that multi page articles don't keep interested readers from staying on the site. Visitors who entered the site from search on the second or third article page had a very high bounce rate (even with easy to find links to the first page of the article.)
Ad income per visitor was lower on the single page articles. However, most of that income loss was recovered by higher visitor counts.
A somewhat intangible metric (and another form of income) is linkability. In my opinion this is better for the long, single-page articles.
This is a topic where there is not a simple yes/no answer. I have some situations that go the other way.
Most important in my opinion is the quality of your traffic and the quality of your authors. If you have low quality traffic pouring in from search, you better make your money from the single pageview. However if you have high quality traffic who click into the article, not from search, but from a link on your own site, and a high quality writer who can pull the visitor into multiple pageviews then you might make more money on the multipage articles.
Your comment got me thinking about a fallacy in the multi-page article thought process, from the ad/marketing perspective. The assumption is that a 4-page article generates 4X more page views, but that's only the case if people consistently click through the entire article. Not all of those 4 pages are created equal.
The irony is that clients are putting 4 ads on 4 pages (just to stick to my example) because they want all of the ads to have the same "spot" (above the fold) and weight, but that's an illusion. Even though the ad position is identical, page 1 is probably a significantly more lucrative spot to be on than page 4.
Right, most visitors view only one page.... a smaller number view two... a smaller number view three.
So, it does not take a massive traffic boost to equalize the income.
I totally see how this can be accomplished using JavaScript (including reloading the ads with each page link), but it still sucks for user experience. In my experience, user experience is the most important asset in getting page views anyway. In the long run you will attract more readership (and thus serve more ad impressions) by designing for users.
I disagree with those who think that three pages have more virtue for SEO long tail targeting than one long content page. One single page will attract consolidated links and can probably outrank the 3 link-diluted pages, even for long tail terms.
I am as far from an expert as possible; however, we have content that has to be paginated for faster loading and because of issues with our server load. As a result we have no choice. Also, our revenue stream is from ad impressions. We have had limited success with using AJAX to refresh the ads along with some content so we do NOT lose impressions. However, the other "marketing" answer to this, if you are CPM-based and just want to use a single page for your content, is to stack the ads. If you are only running a single ad per page at the top of the content, it scrolls out of sight, likely below the fold, anyway - so why not stack another ad underneath it?
And doing the math to determine which is better (stacked versus pagination) would be very interesting and important. Probably something you'd want to continually test - especially on a large site. If it is true that users fall off rather significantly from page 1 to 2 to 3, etc., then it may well be better to stack ads. However, in the spirit of building a long-term relationship with your ad providers, you'd need to charge less for position 2, 3, 4, 5, etc. I am thinking a bidding situation would be the most powerful, but maybe I've been doing too many AdWords campaigns for too long. ;-) LOL
Nice feedback!
Brent D. Payne
Hey, that gives me an idea for a new metric (please correct me if I just crawled from under a rock and it is not "new") - instead of pageviews, let's count pagestacks. An ad, or a group of ads, at the top of a page is only effective while a visitor is viewing above the fold. By stacking ads on each fold, we'd make sure advertisers get their impressions without artificially inflating the pageviews by splitting content. Visitors would appreciate one-page content as well, I'm sure.
I'd like to paraphrase something I've heard and say "Build for users, not for advertisers". If you annoy enough visitors with paginated pages, you'll lose them, which will drive pageviews down in the long run.
My suggestion would be to stack ads vertically on a page, one per screen view/fold - you avoid the usability problem and keep ad impressions the same.
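To make the stacking idea concrete, here's a rough sketch that drops an ad slot into a single-page article roughly once per viewport of content. The container id is an assumption, and the ad slot is left empty where a real ad tag from whatever network is in use would go.

```javascript
// Rough sketch of "one ad per fold": walk the single-page article and insert an
// ad slot roughly once per viewport height of content. The #article-body id is
// an assumption made for this example.
function createAdSlot() {
  var slot = document.createElement('div');
  slot.className = 'ad-slot';
  slot.innerHTML = '<!-- ad tag from the ad server goes here -->';
  return slot;
}

function stackAdsPerFold(containerId) {
  var container = document.getElementById(containerId);
  if (!container) return;
  var fold = window.innerHeight || 600; // rough height of one "screen view"
  var blocks = Array.prototype.slice.call(container.children);
  var heightSinceLastAd = 0;
  for (var i = 0; i < blocks.length; i++) {
    heightSinceLastAd += blocks[i].offsetHeight;
    if (heightSinceLastAd >= fold) {
      container.insertBefore(createAdSlot(), blocks[i].nextSibling);
      heightSinceLastAd = 0; // start measuring the next fold
    }
  }
}

stackAdsPerFold('article-body');
```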
Having just scrolled down one of the longer pages in the web universe, I wonder: Why can't we keep long pages?
I like long articles on single web pages. It is simpler to write the web page and simpler to read.
Maybe it depends on the audience.
Scrolling down this page was fun, reading as I go. This is different from a long article, but I'd hate to have to click to a "new" page to see what the next comment was.
I like the idea of layers and well, fake pages; but once your reader starts clicking (..and waiting), wouldn't that increase the probability that the reader's mind will wander and move on to another site?
It is a way to get more page views, but it is against the Google webmaster guidelines.
I used this exact example last year (with decidedly crappier-looking images) to describe the guideline "Make pages for users, not search engines".
The ad view thing could also be seen as a "trick" - another Google guideline no-no.
As you know, I don't really care if you call it gray, white or black hat - but the fact is that the strategy you propose has a much higher risk of penalties than keeping all content on one page or doing traditional pagination.
So to me the question is not whether this poses a higher risk - it does, that's a fact - but whether you are willing to take that risk, and whether taking that risk is A) necessary and B) the best solution.
Ask the client: Can you live with being removed from the search engines for 1, 3, 6 or 12 months? You will probably get back in once busted if we fix the issues, but can your company survive if all search traffic suddenly drops out?
If the answer is no, then you know what risk profile to pick :)
Mikkel - first off, great to see you here :)
Second - is it really something the engines will definitely penalize? This feels to me like a case where the experience for engines and users is the same, and you're simply using CSS layers to provide a better user experience, particularly if we're talking about 12,000 word articles where pagination really is critical...
Well, when you are not dealing with an extensive (12,000-word) article, layers are probably not the best way to go.
Instead of using an onclick attribute for the page numbers and a display attribute for the content, try using CSS to make the ad block follow as you read through the article, and give the content blocks an onfocus attribute to refresh ads.
This way you have no undisplayed content to worry about, and you should be able to meet both your impression and usability goals. One caveat is that I don't know people's reactions to an ad block that scrolls with them - it may actually increase the attention paid to ads, or it may just annoy them.
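In case it helps, here's one rough way to sketch that out. A scroll listener stands in for the onfocus idea (plain divs don't fire focus events without extra tabindex plumbing), the element ids/classes are assumptions, and refreshAds() is again just a hypothetical stub for the ad server's reload call.

```javascript
// Rough sketch: pin the ad block so it follows the reader, and refresh it once
// as each content section comes into view. Ids/classes are assumptions;
// refreshAds() is a hypothetical stub, not a real ad network call.
function refreshAds() {
  // Placeholder for the ad server's own reload routine.
}

var adBlock = document.getElementById('ad-block');
adBlock.style.position = 'fixed'; // ad column follows as the reader scrolls
adBlock.style.top = '20px';
adBlock.style.right = '20px';

var sections = document.getElementsByClassName('article-section');
var lastRefreshed = -1;

window.onscroll = function () {
  for (var i = 0; i < sections.length; i++) {
    var rect = sections[i].getBoundingClientRect();
    // Treat a section as "in view" once its top crosses mid-viewport.
    if (rect.top < window.innerHeight / 2 && rect.bottom > 0) {
      if (i !== lastRefreshed) {
        lastRefreshed = i;
        refreshAds();
      }
      break;
    }
  }
};
```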
I hate clicking through multiple page views.
I also notice that the sites that do this the most are usually those that are the most stuffed with advertising.
I really like the sites that offer a "show the whole freakin article on one page" option. They get me back again.
I don't disagree with you often Rand but...
Using JavaScript to split a full page of content into readable sections probably won't reload the page in the browser.
Assuming you're talking about tabs or pagination links, and everything is loaded into the browser at the start (check the source code if you're not sure), then the JavaScript only controls a hide/display style element. If JavaScript is disabled then the entire block of content should be displayed - if you're designing with usability in mind this is important.
Unless the page 'reloads' to create a new browser impression, the publisher will actually only get 1 page view's worth of revenue.
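To underline that last point: if you do go the layer route, it's safest to leave the full article visible in the raw HTML and only hide the later "pages" once JavaScript is actually running - something like this sketch (class name assumed). A plain style toggle still won't register as a page view or ad impression on its own.

```javascript
// Keep pages 2+ visible by default in the HTML; hide them only when JavaScript
// is confirmed to be running, so no-JS visitors (and crawlers) see everything.
// The "article-page" class name is an assumption for illustration.
window.onload = function () {
  var pages = document.getElementsByClassName('article-page');
  for (var i = 1; i < pages.length; i++) {
    pages[i].style.display = 'none';
  }
};
```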
---
Also, splitting the page into 3 distinct sections may give the SEO opportunity to uniquely target keywords - but at the expense of diluting the inbound links.
Just read the post again and noticed Rand was possibly talking more about CPM ads, so I may be wrong about this with respect to banner ads.
Hi Rand, while this is my first comment on SEOmoz.com, I have been a loyal fan for so long now. I just wanted to point out some important factors specifically for News Sites:
1- From a News Optimization perspective, you have to paginate long stories. Google News, for example, doesn't index long articles; you can check the "article too long" error in the Google News Publisher Help. From my personal experience, most stories with more than 700-800 words don't get indexed by Google News (there are some exceptions for authority sites).
2- While many sites use CSS layers "tabbed content" to hide/display content on the same page, this is not a good practice for large sites specifically News Sites where they publish hundreds of stories each day. The excessive amount of layered (hidden) content could raise suspicions and get a site flagged.
3- The best approach for long articles on big sites is to paginate articles in multiple pages, where each page has its own unique story headline (page title) and to provide text links to all parts of the story on each page.
4- Google News crawlers are more restrictive than the normal Googlebot, i.e. while one long article (over 800 words) may have a better chance of ranking in Google web search, a paginated article will have a better chance of being indexed in Google News.
1. Have there been any decent user studies that give us a definitive answer on whether or not users are bothered by having to click to page 2, 3, 4, etc.? Keep in mind what annoys us may not annoy the rest of the world. A/B or multivariate testing on a site should be able to quickly give some usable feedback. I think there are several good user experience reasons to split the page up, not the least of which are page load times and the sheer dauntingness of a very long article versus a nice, neat article with some pagination links.
2. As for which approach is best... I think Patio11 made some really good points. The email-a-friend option (as well as RSS, bookmarking services, etc.) pointing to the lead page is a no-brainer. The funneling of page juice on the subsequent pages back to the lead article page is also great advice. I'd like more information on the table of contents aspect, though, as I feel it may create some bot confusion regarding which page on the site is the most authoritative for what. Patio11, can you expound?
Now here are some things that I would do that haven't been mentioned . . .
A. What about a conditional 301 redirect? I haven't had a chance to use one, so I'm not sure which color of SEO it would be (if anyone knows, please do share). What do I mean by a conditional 301 redirect? Well, if the referring URL is coming from an external source, then do a 301 redirect to the lead page (page 1). If it is coming from an internal page... don't 301 redirect. (A rough sketch follows below.)
B. User-generated tagging (see the PowerReviews implementation, but without the JavaScript; they have an XML version if you beg long enough). Allow users to tag what the article is about and have those links go to authoritative landing pages, with a well-SEO'd page flowing link juice masterfully from the landing page back to the articles you are trying to gain the most link juice on. This goes a bit beyond the pagination issue, but it would still help establish the first page as the most authoritative by having links flow up to the landing page and then back down only to the first page. Have enough internal links from several 'Obama' tags to the landing page to pool link juice and then funnel it back to the first page, and you are doing quite a bit to send more juice to the first page versus the paginated pages.
C. Syndicate via XML with some really juicy anchor text. Feed only a synopsis of the article (i.e. the first paragraph of the news article, or if you have the resources, write a unique paragraph to syndicate). If you can get your partners to embed the XML into their site or blog, you will have some great juice flowing into the site and pointing to the first page of the article versus the subsequent pages.
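For idea (A), here's a very rough sketch written as Express-style Node middleware purely for illustration; in practice this would more likely live in server rewrite rules. The route shape and domain are made up, and whether engines would treat this as cloaking is exactly the open question raised above.

```javascript
var express = require('express'); // illustration only; any server stack works
var app = express();

// Rough sketch of idea (A): visitors landing on page 2+ of an article from an
// external referrer (or with no referrer at all) get 301'd to page 1, while
// internal navigation reaches the requested sub-page. Route shape and domain
// are assumptions for illustration.
app.get('/articles/:slug/page/:n', function (req, res, next) {
  var page = parseInt(req.params.n, 10);
  var referer = req.get('Referer') || '';
  var isInternal = referer.indexOf('example-news-site.com') !== -1;
  if (page > 1 && !isInternal) {
    return res.redirect(301, '/articles/' + req.params.slug); // back to page 1
  }
  next(); // fall through to the normal sub-page handler
});
```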
Thoughts??
Brent D. Payne
Brent - my experience in web analytics suggests that our users find split content useful only if the content is asking to be 'sectioned' (like Britney :)
A help page, or product specs, can benefit by being split via tabs, but splitting a page of engaging copy (such as an article) sees a diminishing amount of users visiting all the pages. Basically I've learned that content should only be split if the sections are self-contained with their own call-to-action.
Obviously War and Peace will need paginating, but that's a different kettle of fish to web journalism :)
Very useful and very direct feedback. Thanks for that. I think it would be good to look on a per site basis but having general knowledge of what to expect or which way to move towards initially is very helpful.
Brent D. Payne
On the topic of usability and pagination, it seems like the early (pre-2000) opinion was for splitting up large content chunks, while the post-2000 view has been against pagination. Part of it's a bandwidth issue (long pages load a lot more quickly than they used to), but part of it is a change in how we construct content.
Pre-2000, a long page was just a huge stream of text, and from a usability standpoint, unbroken text is generally a bad thing. Nowadays, we realize that content can be organized and broken up in all sorts of ways; with headers, subheaders, lists, quotes, images, etc. That chunking of content is really the key, and now that we do it better, the benefits of pagination have all but disappeared.
I think pagination only works for end-users when a very large piece (like an eBook or extensive how-to) is broken up logically by topic area, with a good table of contents. Just picking an arbitrary length and slapping links to "1 | 2 | 3 | 4" at the bottom has neither rhyme nor reason.
What? You even think that users could find split content useful? What about saving the content - will you go through 10 pages just to have one article saved on your desktop? And even the links wouldn't work after that.
I am really surprised how people here could argue for splitting articles. This is absolute nonsense. It is not usable anyhow; get over it.
I can see the good and bad in that. From a usability end, I hate having to go through a couple of pages to finish an article. It does feel like they are trying to artificially inflate their PVs. From an SEO end, if you are not a larger site, then only the first 100k of your page is going to get crawled. So if you have a longer article, it is good to split it up to get all that content indexed.
Wow, what a thread of comments we have here! Looks like Rand has bowed out of commenting on comments. :-)
Anyone here ever read SEO Chat? They split all of their articles into subpages with unique subtitles as the link to the next page. And let me tell you that it is THE most annoying thing from a UI standpoint. But they get to re-serve fresh ads on each page (and they have a ton of them), and they use nofollows quite well along with a very minimalistic "Article Index" that links back to each page of the article.
So from a CPM standpoint (this is the point of Rand's post) they are maximizing ad revenue. From a UI standpoint it is annoying but guess what, if the content is good enough you will still keep coming back. And from an SEO standpoint, if your site is well optimized, and your content is compelling, you will not have issues ranking or driving traffic to the site's main article pages (page 1).
Therefore, you are splitting hairs and trying to overcompensate for a (probably) weak site/content by feeling that you will lose long tail opportunities by splitting the pages up. You should focus on good content and linking to drive the traffic, and split your pages up if you are basing performance off of CPM.
Now if Rand's idea of using CSS and Javascript really does work and does not offend Google then we have all found a pleasant middle ground between SEO, CPM, and UI. And that my friends would just be fabulous!
This also has a tie-in to long sales letters converting better, which speaks to EGOL's observation that "most visitors only view one page". The medium, however, definitely comes into play.
For a client like NPR, you'd likely go with pagination {sarcasm}or write every article with a bent towards purchasing something from the NPR store{/sarcasm}
The main problem with the CSS + JS trick is that it's clearly a fraud from the advertiser's perspective. What's even worse, he can easily notice it in his logs.
He'll see repeatedly that one single visitor on one single URL somehow generates 3-4 ad impressions. He explores the issue, finds out about the trick in about a minute, and you're done.
I agree with you, Mr. Rand, only in the case of a story with multiple sections, but not in the case of a category with multiple stories...
Of course, a story = article = single news item, not an entire book.
This one is practical, thanks, Rand. I've been reading your blog for years. Just like esiod.com.
Using CSS and DHTML layers won't get the publisher more pageviews - most analytics packages won't track layer show/hide.
That's why a lot of publishers are going back to the individual URLs.
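One partial workaround worth noting: many analytics tools let you fire a "virtual" pageview by hand when a layer is shown. The snippet below uses the classic async Google Analytics queue as one example; the onArticlePageShown() hook and the URL scheme are made up for illustration.

```javascript
// Many analytics packages support manually-fired "virtual" pageviews. Shown
// here with the classic async Google Analytics queue (_gaq) as one example;
// the onArticlePageShown() hook and the URL scheme are assumptions.
function onArticlePageShown(pageIndex) {
  if (window._gaq) {
    window._gaq.push(['_trackPageview', '/article/some-story/page-' + (pageIndex + 1)]);
  }
}
```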
I wonder if the SEOMoz system could be upgraded to include people's comments in the email that is sent out once we've left a comment. As it is now, we have to come back to the web site to read comments, bumping up page views again, although when I arrive here I am directly at a comment, so I never see the ad at the top of the page.
This hits close to home. I have exactly these issues: a demand for PVs and better rankings for an article-heavy site. Other issues, not touched on here but touched on elsewhere on SEOmoz, such as the use of a Content Management System, also factor into how the content is ultimately rendered on the site.
The above example is an interesting one, and one I'll be thinking about as an alternative.
Thanks!
As for usability, I turn to our squeaky wheels.
When our pagination was removed no one wrote in complaining that articles were difficult to read, confusing, or just plain annoying.
Adding pagination, on the other hand, produced plenty of emails complaining about the "feature".
So I would suggest not going through the trouble of paginating long news content. With proper sub-headings the readability should be absolutely fine and people don't have to interrupt their train of thought to click and potentially wait for the rest of the content to show up.
The trade-off in CPM to retain readers, rather than have them find a competitor's website that doesn't annoy them, is worth it.
Just have auto-refresh to bump up your impressions.....
I'm not actually suggesting that, as it's not exactly the most open way to look at stats - but I'm sure a lot of companies use it...
We have solved the usability problem by using pagination logically and offering a downloadable PDF version of the article on each page.
The article is a long (and I mean long) dry academic paper and 5 months later we are still averaging over 2 downloads a day.
Informative article, great visuals! Thanks Rand
IHT tried something similar to this, but I noticed that they relegated it to an optional feature. My guess from that is that it was a usability or acceptance issue.
[edit] I doubt they tried this for SEO reasons BTW
Excellent topic... This is a procedure I will definitely have to pursue. I would be interested to know how the search engines will handle the process. I would agree that it's better for users, but that doesn't mean the search engines agree.