Despite being a seemingly simple topic, this one seems to stymie even experienced SEOs. There's a natural conflict that creates the issue: the more keywords you target on a single page, the less link building and optimization (for both search engines and user experience/conversion rate) you need to do across many pages.
To answer this question in a logical and truly optimal fashion, you need to start with the answers to two other important questions:
- How many of these keywords carry the same visitor intent?
- How competitive are the targeted terms/phrases?
When you answer the first question, you'll be able to break up lists of keyword terms into buckets of "intent." Searches are almost always intended to discover information or take action. If there are too many pieces of information/actions you need to provide on a single page, your conversion rate will drop. Remember that a 10% conversion rate for position #10 is better than a 0.5% conversion rate for position #1 (assuming the averages from the leaked AOL data cited below).
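To make that comparison concrete, here's a quick sketch of the arithmetic. The click-through rates used below (roughly 42% for position #1 and 3% for position #10) are approximations commonly cited from the AOL release, not figures stated in this post:

```python
# Expected conversions = searches * CTR * conversion rate.
# CTR figures are rough approximations commonly cited from the AOL data leak.
searches = 1000
ctr_pos1, ctr_pos10 = 0.42, 0.03        # approximate CTR for positions #1 and #10
conv_broad, conv_focused = 0.005, 0.10  # 0.5% vs. 10% conversion rate

conversions_pos1 = searches * ctr_pos1 * conv_broad      # broad page ranking #1
conversions_pos10 = searches * ctr_pos10 * conv_focused  # focused page ranking #10

print(round(conversions_pos1, 2))   # 2.1
print(round(conversions_pos10, 2))  # 3.0
```

Under these assumed numbers, the tightly focused page in position #10 still produces more conversions per 1,000 searches than the broad page in position #1.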
NOTE: This data is from averages via AOL's data release in 2006. New numbers have not been forthcoming from any of the engines or third-party studies.
For the second question, you need to know something about the competition levels. In a scenario where every shred of keyword usage matters a great deal, from the anchor text focus to the keyword being employed at the very start of the title tag, breaking up keyword targeting to multiple pages can make a great deal of sense. If you're deep into research on this topic, you can do something like the image below, where I've taken stats and metrics for all of the top 25 ranking pages for the query "broadway tickets" on Google.com and run analysis:
NOTE: data in this graph via Open Site Explorer's Backlink Analysis
If a keyword is highly competitive, I suggest single page targeting. This is not only because you can maximize on-page optimization, but also because it means that internal and external links that point to the page can focus more directly on the target term/phrase. It's also likely that you'll be competing against pages that are more highly targeted on that keyword phrase and could lose out if you don't have that singular, pinpoint focus.
I wrote another post on a similar topic highlighting how to format titles, meta descriptions and keyword usage on pages that aim for multi-keyword targeting that may also be of help.
Look forward to your thoughts on the topic.
Given the fact that AOL data for SERP click-through percentages has been knocking around on t'internet for a few years now, is this something SEOmoz might be looking into researching in the near future? The question of "how many people click through on each link" is one I am often asked as an SEO, and this old data is pretty much all I have to give them.
I know that's not 100% on-topic here, but I feel this set of numbers is something we all widely use without really having any idea whether it's accurate, especially when used in further analysis like this.
I agree with Bludge. However much I hate the AOL data (1st is 1st and 2nd is nowhere), it is useful to know, but we have no context in which to judge it.
If SEOmoz could carry out its own testing, this would be helpful, as you could provide more detailed information on how click-through rates might vary for different industries or different types of search.
The trouble is there are too many factors affecting the accuracy of a study like that now, what with personalisation, geolocation and blended/universal results.
To give you a rough indication, one of my sites is top with one-line sitelinks for its 'trophy keyphrase', and Webmaster Tools reports a CTR of 21%.
A thought: I recently installed Vuze for P2P file transfer. One of the things it does is install a Firefox toolbar that serves its own version of the SERPs with their ads on top. They would have a very good idea of this data.
Anyone doing something similar would be able to give very good data.
I suggest SEOmoz make one and share the data with the people who use it ^^
Yeah - we've been thinking hard about this. It may be something where we ask permission from users of our toolbar (about 5K) in the months/years to come and anonymize the queries, but keep some info like phrase length, number of words, position clicked, etc. The big issue is getting a large enough sample size to be relevant.
The trouble with toolbar users is that they're not a representative sample of Internet users. People with the SEOmoz toolbar are likely to be more savvy searchers. In an ideal situation (and I appreciate this is outside of the SEOmoz realm) it would warrant an academic study.
So you can have *ahem* "ordinary" Internet users given a mock up of a Google results page with a fixed set of results displayed in a random order, and they can be asked which one they'd click on. Record the results in a little program, bish-bash-bosh job's a good'un.
Actually, I could probably get myself a PhD out of that (although it'd need refining from the 2 minutes thought I just gave it!)
I'd love to see this too. We have a rough estimate chart on a whiteboard in the office and make use of it for some reporting, but we're constantly telling clients that the breakdown has likely changed, and of course we can't make up for universal search.
This said, some things usually attract news results, images, videos, etc. Click-through stats would now have to come in a group of multiple lists. One for "news appears in the top three", one for "images appear in the top three", etc. If you want to rank for "crossfit training", you're always likely to see video results*. Thus, that list would apply to your work.
It would turn into a far larger piece of work than a numbered list of percentages, but it would give us more to write on the whiteboard.
*in this one example, video results won't load. That's the nature of trying to provide live examples with computers :)
Unfortunately, this is getting harder and harder to measure. The average SERP display is very different than it was a few years ago, with vertical creep impacting a huge percentage of results pages (whether that be Google Maps, Images, News, Twitter, etc.).
The fact that the AOL data predates most of this may actually give us a clearer picture of what the numbers would look like in an ideal situation.
Even if there were another huge data leak, we wouldn't be able to discern whether the user who clicked on the first or second listing saw a Map or Product Feed in the search results or not, which would greatly impact any assumptions we would try to make based on that data.
Personally, what I tend to do is privilege one keyword per page... but
I try to naturally use SEO copy and on-page optimization so that the page also ranks for the semantic derivations of the main keyword, which ultimately works the middle/long tail of the page's main keyword.
This obviously implies link building for that page, but having a semantic pattern helps me diversify the anchor text of the links.
For instance (translating from Italian, which is the language of my site), the first 4 words of my homepage title are "Web Marketing SEO Services" [aka> Servizi Web Marketing SEO].
These 4 words (followed by a tagline about small business) give me the opportunity to focus the page optimization on:
"Servizi Web Marketing SEO" (a long tail)
"Servizi Web Marketing" + "Servizi SEO" (middle tail)
"Web Marketing" + "SEO" (head of the tail).
My website is quite new and doesn't have many external links (working on it ;)), but this technique already has my site ranking 1st and 2nd for middle and long tail terms.
Thanks for the thoughts Rand.
In very competitive niches, many SEOs seem not only to split keywords across single pages, but to split almost the exact same keyword across multiple pages.
We just bought a site about nutrition supplements. It has so many articles that splitting keywords across single pages becomes a very inefficient strategy. For example, it has the following pages:
domain.com/proteinpowder
domain.com/protein-powder
domain.com/protienpowder
domain.com/protien-powder
Not only does this pour the link juice into too many cups, it also becomes very spammy. So your post is great for a day dedicated to 301ing lots of content.
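For anyone consolidating duplicates like these, a minimal sketch of the 301s in Apache `.htaccess`, assuming `/protein-powder` is chosen as the canonical URL and the site runs Apache with `mod_alias` (the paths are the hypothetical ones from the example above):

```apache
# Permanently redirect the duplicate/misspelled URLs to the canonical page,
# consolidating link equity onto a single URL.
Redirect 301 /proteinpowder   /protein-powder
Redirect 301 /protienpowder   /protein-powder
Redirect 301 /protien-powder  /protein-powder
```

A 301 (permanent) status is what signals the engines to pass link value to the destination; a 302 would not.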
I guess the point I am trying to make is that splitting keywords onto single pages is great, as long as it's not overdone.
thogenhaven.....thanks for posting your protein powder example. i need to make some changes on one of my sites after seeing that!
Tony ;~p
Starting from where you are (those 4 pages), you have to juggle the benefit of concentrating link power onto one page against the cost of losing traffic and response for all the long-tail and related keywords on the pages you are about to 'remove'. Assuming they have text.
Yeah - in the case you're describing, the intent clearly matches, and the singular/plural is something where the engines have been moving more and more towards recognizing them as the same query (in most instances). Thus, splitting up link juice/targeting/content between those pages wouldn't make much sense at all.
It can get much more complex for homepages targeting 2-3 separate phrases that "could" be considered to have the same intent. Or for something like "architectural supplies" and "architect supply" where the phrases are most likely the same intent, but the targeting really does require different keyword usage/link anchor text.
I really like your previous post on combining similar intent driven keywords onto a single page (like the flow chart), I think that strategy can make a lot of sense: https://www.seomoz.org/blog/headsmacking-tip-4-use-keyword-variations-with-matching-intent-together
And your reference to the leaked AOL search data (I think it was 2006, if I'm not mistaken) is interesting. I was actually looking at that recently, a few weeks back, and created a sort of click estimation tool... but I have yet to find out what the ratio of clicks to searches performed might be: when 100 searches are performed, how many total clicks (distributed over all positions) result on average, 90, 50?
I've seen splitting keywords like schecter hellraiser across multiple pages can sometimes be beneficial because of duplicate listings in the SERP. I can't always reproduce that however but it is cool when it happens.
Nice article, Rand. I try to focus on singular important keywords :)
I go for the more singular keyword approach: keeping it focused and working on the link building. Even though the page itself doesn't have many links, it will rank well if the domain is strong. Just look at numbers 11 and 23 in the analysis.
For me it is simple. I structure the webpage based on usability and conversion. SEO comes second. SEO is very important, but can never be as important as converting, because traffic is worthless if you don't convert it.
When looking at the table, it is interesting to see number 19 - with a domain strength of only 19, while others are around 90. That answers a discussion that was here a few days ago. So if there is one keyword that is more important than any other for your website, you know where to put it.
I think the analysis shows it pretty clearly, that the singular keyword approach beats the other one by far. The only argument left for multiple keyword targeting is laziness - and that is unlikely to get you far ;)
Had this issue with a past website. We were number one in the industry, but new competitors started targeting singular keywords for the industry.
We went down the path of single keyword focus, but discovered difficulties in link building to all three pages with the resources we had. Indeed, our main page had very strong page authority and outranked our singular keyword focus pages. Accordingly, working in what was a short-term-focused environment, it was a strategy we didn't have a lot of joy with, and we dropped it.
I think the earlier you deploy a singular keyword focus the easier.
I work in an industry (general Law) where I have to target hundreds of keyword phrases, and this was one of the most difficult decisions I had to make.
I chose to target a select number of important phrases on my index page and link build for that, and then set up many pages for individual topics. I find this the tough part; particularly trying to build deep links to these important inner pages.
It seems to be working, but it needs constant work to try and get so many different pages ranked highly.
Thanks for all your invaluable advice. You have saved me from following many a wrong path.
I still believe that targeting a singular keyword will be more effective than targeting multiple keywords on a single page. Reasons:
1. You can have an optimised title for each keyword.
2. When establishing internal linking, search engines are more likely to understand it better (links spread out to different topic pages, rather than all focused on one).
3. More content (easier to implement more long-tail keywords).
4. Good for user experience; the content will stick to the topic.
5. Keeps SEOers and copywriters working hard :D
This is my $0.02
But, on the other hand, a page that has a few thousand links from a few hundred domains has a much better opportunity to rank for a number of phrases than a page with only a few inbound links (and tweaking keyword targeting is tremendously easier to do than massive external link campaigns).
Not saying you're wrong, just pointing out why the challenge/dichotomy exists.
I've always had you down as an exemplar of diplomacy (I'm sure Will Critchlow would agree...), but now I'm starting to feel it's something more. Could SEOmoz's move into software be a cover, nay, a conflict-of-interest-avoiding-necessity for Rand's rise to the white house?
President Randfish .... Sounds good. I'd bag it in the SERPs early.
I've already got the domain. Just waiting for Rand to call...
or a reasonable facsimile thereof?
Nah... I was just 'joshin easysafety. If I buy one more domain on spec' my wife will have a cow ;)
I've been sitting on mywifehascow.org for a while, I'll do you a moz mate rate on it :)
Yeah, Rand, I understand your point.
It's all down to competition, relevance and some other factors such as type of websites, isn't it.
Good point, Rand. I also target the main page for the most competitive keywords, but I try to target the other pages for less competitive ones.
As usual, this post comes very timely as I have a client demanding to target multiple competitive terms on his homepage where the search intent does not match between the keywords!
Usually I find that I can target a main keyword on the home page along with 1 to 2 more highly searched modifications of the same keyword and end up ranking well for all 3 of them. Then the minor keywords I target with individual pages.
The table of data from OSE is awesome, but I assume you had to look up each URL to get that data and then consolidate it in Excel? Correct?
It would be a great feature to be able to enter a keyword phrase and have it spit out the table you created. Am I just missing this ability or does it not exist? If not, any talented developers willing to take a stab at this feature with the OSE APIs?
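To illustrate the shape of such a tool, here's a minimal Python sketch. Note that `fetch_metrics` is a hypothetical stand-in for whatever API call returns link metrics; the real Open Site Explorer API endpoints and field names are not shown in the post, so stubbed data is used here:

```python
# Sketch of the tool described above: given the ranking URLs for a query,
# build one comparison table of link metrics per URL.
# fetch_metrics() is a HYPOTHETICAL stand-in for a real link-data API call.

def fetch_metrics(url):
    # Stubbed data for illustration; a real version would call the API.
    stub = {
        "example.com/broadway-tickets": {"linking_domains": 310, "page_links": 5200},
        "example.org/tickets/broadway": {"linking_domains": 45, "page_links": 880},
    }
    return stub.get(url, {"linking_domains": 0, "page_links": 0})

def competitive_table(ranking_urls):
    # One row per ranking position, merging in the fetched metrics.
    rows = []
    for position, url in enumerate(ranking_urls, start=1):
        metrics = fetch_metrics(url)
        rows.append({"position": position, "url": url, **metrics})
    return rows

table = competitive_table(
    ["example.com/broadway-tickets", "example.org/tickets/broadway"]
)
for row in table:
    print(row["position"], row["url"], row["linking_domains"], row["page_links"])
```

The output is essentially the spreadsheet from the post, built automatically per keyword instead of by hand.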
That's a good point. Would folks here be interested in a tool on SEOmoz that lets you select and pull in lots of columns of data like this to build your own competitive analysis?
Yes, please!! That would be great!
Let me think for a sec.....YEAH!!!
I doubt anyone would be against a tool like that.... :D
After all, we're always looking for ways to make our jobs easier!
Damnit Rand, always taking what we do internally and then making it free and better. Ruins all the fun.
:-)
For sure!
Certo!
¡Está claro que sí!
Yes rand yes!! :)
I love these short tactical guides that bridge some of the heavy strategic stuff you guys produce to the everyday tools that most of us are familiar with. It's especially neat to see what columns you use in your analyses and how your sort them, etc. Good post!
Do you do this level of analysis for sites with existing traffic? It's all theory and guesswork. Great for new sites until they have real traffic.
PPC can tell if the traffic suggested by tools is real and if it converts on your site.
Only real organic results will tell you how your site fares against the competition on the SERPs.
Once you have organic traffic and response rates, that's the first place to look for target keywords - your own stats. Build on proven success. Also, the reality is that if you target one keyword you get results for 1,000, sometimes 10,000.
And you might target one keyword and get no results for it, but thousands of visits for similar keywords or keywords containing your target keyword. The above points make analyzing a single keyword a bit silly. Theoretical, one might say.
Think groups of related keywords. Think keyword niches. As I wrote recently, 'single keywords are for losers' (search with phrase if interested).
I've found that, in certain cases, getting a good ranking for a tough phrase on the home page (which is targeting more than one phrase) and then switching the efforts to getting a deeper page to rank well instead has paid off.
It is generally easier to get that good ranking on a strong HP, which gets the relevant, deeper pages on the site more exposure. Also, I've seen major category pages rise up when it was the HP getting all the good anchor text links - the engines seem to work out that people can be lazy when they link, and give the credit to the most relevant page on the site rather than the HP. That said, I only work seriously on one site, so I don't exactly have a significant data set!
I agree. Most of the time, I try to build links to the main page for the most competitive keywords. After the website has gained some mozTrust, PR, better traffic etc., I start to build links to the deeper pages for their specific terms. It does indeed pay off :)
Keyword targeting aside, wow, that spreadsheet export data from OSE is awesome (drools). This tool is becoming less of a 'nice to have' and more of a 'need to have'.
And, unlike some tools I could mention, it's actually nice to have to need it!
The statement "a 10% conversion rate for position #10 is better than a 0.5% conversion rate for position #1" is a bit of an over-simplification isn't it? For any one keyword term, I would prefer to be in the number 1 spot figuring out how to improve my conversion rate than #10 trying to figure out how to improve my ranking.
I imagine this is just a point in need of clarification as you refer to the possibility of ranking for multiple keyword terms with multiple targeted pages. In this case, #10 spot for multiple keywords each with 10% conversion rates is most certainly better than one "general" page at #1 with 0.5% conversion rate.
A small issue but one that I thought worth pointing out in case too much is read into this statement and people lose sight of the bigger picture.
Well, that depends. I'd actually rather have the page converting at 10% because, at least for me, SEO is the much easier side of the equation (and a page converting that high must be doing some awesome things right and can likely earn real buzz around it).
Interesting point, and likely why we sometimes have disagreements with new clients: they have followed the many-links-to-one-page model.
I would assume it helps if the multiple links are themed, so a page about soft drinks would have backlinks like Coke, Fanta, Lift, 7Up, Soda Drinks...
Hi friends, I am here for help. One site I am working on is in trouble now. Its ranking is continuously fluctuating... sometimes it comes into the top 10 of Google and sometimes it drops as far as 500. What should I do so it comes back into the top 10 and becomes stable? Should I stop directory submission for some time? Please reply...
Thanks for the post. For me it should always be a midpoint between targeting multiple and focusing on single keywords.
Though this obviously isn't a do-or-die rule for me, there are some cases where I think a landing page purely for an individual topic is useful.
Thanks again.
Hi
I just started a web site and it's Persian. So the words I'm competing for are Persian (not big competition), and our business is web design, portals, CRM, etc.
I decided to translate articles about SEO and other topics related to our business.
The question is: should I place portal, web design or SEO (in Persian) as a keyword in every article I publish about them?
And if an article is about multiple topics, can I place multiple keywords in that article?
Thanks.
Hi, I am doing some research and am looking for a few suggestions. How do you feel about a site that is fairly new and ranking for under 10 keywords in the top 10? Would you suggest a blog strategy that targets one keyword at a time, or should we target more than one keyword at a time in hopes of getting more traffic? For example, we are looking at ranking for website and internet marketing on one blog, but I feel these are too competitive and will reduce our traffic possibilities. Any insight? We have a heated debate at our office :)
Hi,
Could do with some help, still a newbie.
I have a skin care site and the keywords i want to use are:
How to have clear skin
how to have clear skin fast
how to have clear skin overnight
They all have reasonable searches to justify covering them, but how do I do it?
All on one page with a different H2 for each keyword, or separate pages?
Would appreciate a response from anyone.
Thanks
Regarding that AOL data on search volume:
Universal search can really affect the percentage of clicks each organic ranking gets.
We recently moved a client from position 3 to 2 on Google for a highly competitive phrase and saw a 200% increase in traffic.
I about fell out of my chair because I'd never seen such an increase before. It certainly doesn't mesh with the leaked AOL numbers.
The kicker was that the Google Merchant Center listings were in the real #3 spot, creating a physical break between the top 2 organic listings and the rest of the page.
With Google pulling ever more map results, YouTube videos, and images into search results these AOL numbers become less and less relevant.
FYI!
In many situations, especially in the online shopping field, it's much better to target many keywords on different pages, I think.
One of the shortcomings with targeting a single term per page is that it creates an environment conducive to over-optimization. If you are even in a semi-competitive space and rely on some form of paid linking, you run serious risk of acquiring too many same-anchor links. If you then have to acquire other paid or non-paid links to anchor-text-launder, you might as well target another term or two.
That being said, I still believe in creating content for every primary term. However, if you are going to use paid links, you may as well employ a multi-term strategy.
Good post, yet the comments here are quite confusing. Some people target multiple keywords per page, some target a single keyword per page. Multiple keywords are a good approach for less competitive key phrases.
I loved the post and comments both.
I love to choose singular keywords, and those multiple ones with a heavy search volume.
I'm definitely on the camp of singular keyword focus for each page. I guess this is because most of the websites I have worked on are SME sites which are quite niche. So it helps me to target synonymous words for different pages on the website. But the most important word goes to the index page.
However, if I was working on an e-commerce site, I would see the importance of categorizing and subcategorizing pages to provide good landing pages for these categories, that way you get to eat your cake and have it, like Rand has said.
Well said.
Nice post… One should always keep this in mind while doing SEO for any website. However, what if my website is about web design/development and SEO services? What should I target on the home page? Would it be a good idea to target only web design on the home page and the other keywords on internal pages?
I would use the homepage to tell the customer who/what you are and what you do.
Then promote the internal 'product' pages with on/off page optimisation related directly to the service they describe.
Rather than simply trying to push higher indexation of the homepage, promote the internal ones where it is easier to make keyword specific.
It really depends - if you're targeting a particular geography, I'd likely include all three on the homepage if the competition is relatively low. If it's very high (New York, San Francisco, etc.) then I'd be more likely to choose your primary focus for the home page and target the others on internal pages.
Rand, but let's say the competition is very high
and the targeted keywords are very close (nearly synonyms, in the case of location)?
My client's site targets "Manhattan Movers" and "Movers NYC" -
it's not easy to decide what the BEST practice is in this case...
------------------------------------------
I'm thinking that targeting Manhattan Movers and Movers NYC on the home page would be strategically better for future SEO; with the current tendency toward localization, mobile search and Google Instant, people will be searching for "Keyword", but not for "Keyword" + "Location".
Oh, you answered my question too.
Yeah, in my website the competition is quite low, so I should place our services on the homepage.
It's been a month since I found your articles and I'm reading them 4 hours a day.
Thanks for your help.
W
I think that this issue may be related to the following research: https://www.seomoz.org/ugc/using-anchor-links-to-make-google-ignore-the-first-link . It may appear that having one page optimized for various phrases and linked using hashes may be higher in Google than many pages optimized for one phrase each.
I've always thought that the best approach was to keep keywords per page to a narrowly limited scope. I understand the need to have multiple keywords, especially on home pages, but for best search results, not to mention conversion results, narrow focus is better than broad focus.
There are no simple topics in SEO. Nor are they anywhere else :)
Great post.
Sometimes it's hard to figure out whether to do separate pages or put all the keywords on the same page.
Research can sometimes lead you to that middle point where you can go either way.
I feel like I hit this middle point too often.
I know you have mentioned that if the keyword isn't very competitive then do it all on a single page, but often I do the opposite.
Sometimes, because the keyword isn't very competitive, I create a new page for it, and therefore the linking campaign doesn't have to be so intense, because it's easy.
And sometimes the keywords are competitive and I bring them all to the front page to reach maximum linking ability.
I am working on an eCommerce website. Right now, I am facing a similar situation for my category page. [Leather Office Chairs]
I am trying to get a top 3 ranking for the Leather Office Chairs & Leather Office Chair keywords. After reading this post I have developed a list of keywords as follows.
All keywords have good exact search volume on Google. I am a little bit confused about keyword competition & using allintitle: in Google. So, is that OK for getting an idea or not?
Let's try these tactics & hope to get positive results. BTW: Thanks for your valuable post!
You must be using "broad" instead of "exact" in the Google keyword tool; that's why you are getting the same search volume for each phrase.
Do not try to use all the phrases exactly as-is on pages. Instead, focus on the main key phrase in the title and use the separate words in the body: "leather", "chairs" etc.
The biggest problem when looking at this is the industry that the keyword falls into. I think having all your eggs in one basket is a bad strategy.
I have a client that wants to be top of Sweets, and this is a task that cannot be done easily.
The keyword retro sweets is a lot easier to go for, so I recommended this to my client; once there, we'll attack the bigger keywords.
I have to add one off-topic comment though: working with a retro sweet shop is ace, free sweets.....
-- Link removed - Jen
A lot of this will come down to site architecture and making sure that categories are given their own pages and then optimised for the targeted keywords.
There was a previous blog post about this, but I couldn't find the one I was after, and instead found a great WBF vid that explains some of it.
It also shows Rand with a crazy hair thing going on! Good Times!
https://www.seomoz.org/blog/whiteboard-friday-content-categorization-for-seo
What's a mozzer to do? This is either really good spam or a poorly constructed comment, and I can't tell which! It's like the phishers that are getting so good at what they do.
Should I thumb down for being spam? Should I thumb up for being really well constructed spam? Should I do neither in case it's an honest (but clueless) commenter?
I'm going to believe the best of chameleonwebservices and let it go.