When we do keyword research, we tend to focus on discovery. We take a short list of keywords we think matter, brainstorm wildly, paste the resulting list into a dozen tools, paste the results back into Excel, and measure our success by how often our spreadsheet crashes. Then, we throw it all away when our tax attorney client tells us he only cares about ranking #1 for "taylor swift downloads," because he heard that gets a lot of traffic.
Maybe I'm exaggerating. Keyword discovery is a critical process, but what we’re left with at the end is a long and often rambling list to prioritize, and typically we prioritize either by our own gut feelings or by the black box of AdWords global volume. What if there were a better way?
When we were building Keyword Explorer, we wanted to solve the deeper problem — how do we pick the best keywords to start with, given the complexity of Google SERPs and our competition in modern SEO? Which keywords really balance potential traffic with ROI?
Over the course of many months, we created four metrics:
- Keyword Difficulty (V2)
- Organic CTR
- Importance (user-defined)
- Keyword Priority
Today, I want to explain the philosophy of these metrics, some of the math behind them, and how you can use these ideas to reinvent your approach to keyword research. Stepping outside of our product, I'm going to try and translate these metrics into questions that are relevant to anyone, regardless of which tools you use.
1. How difficult is the keyword to rank for?
All else being equal, we’d rather rank #1 on a keyword that gets a ton of traffic. In the real world, though, all else is rarely equal. High-volume keywords are often highly competitive, which translates directly into more ranking difficulty. More difficulty means more time and/or more money.
A few years ago, we developed a Keyword Difficulty metric based on our authority metrics (Domain Authority and Page Authority). Page Authority (PA) is a constantly evolving metric designed to correlate with a page's ability to rank on Google, based in large part on the page's link profile. Keyword Difficulty (V1) used the Page Authority of results in the upper-middle of the SERP (the median PA, more or less) for a given keyword to approximate how hard that keyword would be to rank for.
Our updated Keyword Difficulty (V2) score uses a more complex, click-through rate (CTR) weighted model of Page Authority across the first page of a SERP, reflecting more of the competitive landscape, and adapts to today’s irregular organic result counts. V2 also does a better job of filling in gaps when PA metrics are missing for one or more results. Finally, V2 scales scores to fill more of the 0–100 range and provide better granularity.
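If you want to see the general shape of a model like this in code, here's a rough sketch. To be clear, the click-through weights and the fallback PA below are illustrative placeholders, not the actual numbers behind Keyword Difficulty V2:

```python
# Illustrative sketch of a CTR-weighted difficulty score (not Moz's actual model).
# The click-through weights and fallback PA are assumptions for demonstration only.

# Assumed click-through weights by organic position (#1 first).
CTR_WEIGHTS = [0.30, 0.15, 0.10, 0.08, 0.06, 0.05, 0.04, 0.03, 0.03, 0.02]

def ctr_weighted_difficulty(page_authorities, fallback_pa=40.0):
    """Estimate difficulty (roughly 0-100) from the PAs of page-one results.

    page_authorities: list of PA values, with None where PA is unavailable.
    """
    # Fill gaps where PA is missing for a result.
    pas = [pa if pa is not None else fallback_pa for pa in page_authorities]

    # Adapt to SERPs with fewer than 10 organic results.
    weights = CTR_WEIGHTS[:len(pas)]

    # Weight the positions that capture more clicks more heavily.
    weighted_pa = sum(pa * w for pa, w in zip(pas, weights)) / sum(weights)
    return weighted_pa  # V2 also rescales this to spread across 0-100

# Example: a SERP whose top results have strong link profiles.
print(ctr_weighted_difficulty([72, 68, None, 55, 51, 49, 44, 40, 38, 35]))
```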
If you want to cheat and build your own proxy for Keyword Difficulty, I'm going to risk a few sales and let you in on a secret. Install the MozBar SEO toolbar (it's free), run your Google search, and grab the PA from result #4:
Why #4? That's a rough approximation of the median PA on page one, adjusted slightly for the prevalence of SERPs with fewer than 10 results. You can pull similar data (although it takes a couple more steps) from Open Site Explorer (also free). Again, it's a rough approximation, and Keyword Difficulty V2 adds quite a bit, but it's a start. You've got one more column in your spreadsheet.
2. How much organic opportunity is there?
The days of 10 blue links are gone, and today’s SERPs are a complicated mix of paid results, organic results, vertical results, and Knowledge Graph features. Traditional keyword research assumes a mythical page of nothing but blue links and 100% click-through potential.
Let's look at a modern SERP, a result for the brand "Forever 21" in Portland, Oregon:
While there are many potential opportunities on this page, the only traditional SEO opportunities that a company other than Forever 21 can realistically compete on are the three in green. The first position is naturally dominated by the brand, the two rows of site-links beneath it each occupy one row of organic results, there are two verticals (Twitter and News), a local pack, and the three results at the bottom are a pack of In-depth Articles and not subject to the same algorithm as core organic results. Plus, there's a Knowledge Panel on the right that has the potential to draw away clicks.
From a traditional SEO standpoint, there is very little organic opportunity available on this page. Our Organic CTR metric attempts to measure this phenomenon. It starts with the assumption of 100% CTR (that's idealized, of course, but it makes the scale go from 0–100), and then begins subtracting clicks based on non-organic features, including ads. Ads and Knowledge Graph features take away clicks and re-shift the CTR curve. Verticals occupy an organic position and remove the CTR of that position, etc. If the #1 position has site-links, we assume that position has "dominant intent" (to use Google's vernacular) and probably isn't in play.
The Organic CTR model gets complicated and it necessarily makes many assumptions, but the goal is to subtract out all of the non-organic elements and try to figure out what's left. We hope to enhance and evolve this model as time goes by and we gather more data about how features impact CTR.
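To make the subtraction logic concrete, here's a minimal sketch of the idea. The feature list and the click costs assigned to each feature are purely illustrative assumptions, not the values our model actually uses:

```python
# Rough sketch of an "available organic CTR" estimate for a single SERP.
# The per-feature click costs below are illustrative assumptions only.
FEATURE_CLICK_COST = {
    "ads_top": 15.0,          # paid results above the organic listings
    "knowledge_panel": 10.0,  # Knowledge Graph panel drawing clicks away
    "local_pack": 12.0,       # local pack occupying prime real estate
    "vertical": 8.0,          # Twitter/News/etc. occupying an organic slot
    "sitelinks_top": 20.0,    # dominant-intent #1 result with site-links
}

def organic_ctr(features):
    """Start from an idealized 100% and subtract non-organic features."""
    score = 100.0
    for feature in features:
        score -= FEATURE_CLICK_COST.get(feature, 0.0)
    return max(score, 0.0)

# Example: a crowded, brand-dominated SERP like the one described above.
print(organic_ctr(["sitelinks_top", "vertical", "vertical",
                   "local_pack", "knowledge_panel"]))  # -> 42.0
```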
You don't have to use Keyword Explorer or build your own, equally complicated model, but I think it's very important to look at SERPs in a browser as a real searcher and understand the available opportunity. Even if you do keyword research by hand, give that opportunity a score (1–5 is a good start). Some of the most attractive, highest-volume keywords also have the least available opportunity.
3. How important is the keyword to you?
We've all got stories about clueless clients, but their experience does matter and some keywords have more business relevance than others. The trick is not to let it become too subjective. Put a number against it. Make your client, boss, or team prioritize. Everything can't be a 10, so create a column and force them to pick a value. Quantify your intuition and put it to work.
In Keyword Explorer, we wanted to allow customers to adjust keywords up and down to reflect insights and intuitions our models may not have. Importance is essentially a multiplier, and it ranges from 1–10. We default Importance to a value of 3. That may seem like an unusual choice, but it allows you to easily shift a keyword by roughly a factor of 3 in either direction (1 = 1/3X, 9 = 3X). It also gives you a bit more granularity for upward adjustments than downward. Our assumption is that most people will tend to focus importance adjustments on critical keywords that are essential to their business.
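If it helps to see the arithmetic, the relative effect of an Importance override is simply its ratio to the default (a trivial sketch using the 1–10 scale described above):

```python
# Importance is a direct multiplier on Priority. With a default of 3,
# overriding it shifts a keyword by roughly I / 3 relative to the default.
DEFAULT_IMPORTANCE = 3

for importance in (1, 3, 5, 9, 10):
    print(importance, round(importance / DEFAULT_IMPORTANCE, 2))
# 1 -> 0.33 (about 1/3X), 3 -> 1.0, 9 -> 3.0 (about 3X), 10 -> 3.33
```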
4. Which keywords have the most potential?
So now you've got a mountain of data, which is great in theory but a bit overwhelming for many of us in reality. We thought it was important that people have the raw metrics — many of you are advanced SEOs and want to slice-and-dice the data into your own models. However, we also thought it was important to provide guidance and help make sense of it all. Perhaps the toughest question at the end of the keyword research process is "Where do I start?"
Counting keyword Volume, we have the following metrics (or variables):
- Volume (V)
- Keyword Difficulty (KD)
- Organic CTR (OC)
- Importance (I)
Keyword Priority (KP) combines all of these metrics, creating a weighted score (also 0–100). Volume is a real number that carries meaning by itself. You can think of both Keyword Difficulty and Organic CTR as multipliers. Higher Difficulty reduces Priority, while higher Organic CTR increases Priority. Likewise, Importance is a direct multiplier. Our formula for Keyword Priority looks something like this:
KP = sqrt(V) * (1 - KD / 100) * (OC / 100) * I
We use the square root of Volume so that high-volume keywords don't automatically overwhelm all of the other metrics, but very high-volume keywords still naturally carry a lot of potential. The resulting value is re-scaled in Keyword Explorer to a 0–100 score, but that math gets a bit tricky and is somewhat unique to our own internal metrics and the ranges of data we historically encounter.
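If you want to prototype this outside the product, a small script is enough to sort a keyword list by Priority. The keywords, metric values, and the naive rescaling below are hypothetical stand-ins (our internal rescaling is different and data-driven):

```python
import math

# Hypothetical keyword data: (keyword, Volume, Keyword Difficulty, Organic CTR, Importance)
keywords = [
    ("keyword research tools", 6500, 72, 55, 3),
    ("how to do keyword research", 2900, 58, 80, 5),
    ("seo keyword spreadsheet", 400, 35, 90, 3),
]

def keyword_priority(volume, kd, oc, importance):
    """KP = sqrt(V) * (1 - KD/100) * (OC/100) * I"""
    return math.sqrt(volume) * (1 - kd / 100) * (oc / 100) * importance

scored = [(kw, keyword_priority(v, kd, oc, i)) for kw, v, kd, oc, i in keywords]

# Naive rescaling to 0-100 relative to the best keyword in this set.
best = max(score for _, score in scored)
for kw, score in sorted(scored, key=lambda pair: -pair[1]):
    print(f"{kw:30s} {100 * score / best:5.1f}")
```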
Even if you do keyword research by hand or in a semi-automated fashion, there are many ways to adapt this basic concept and create a useful meta-metric. Obviously, one metric can never convey all of the complexity of rich data, but it's important to remember that this is not an either/or situation. You can use a meta-metric for sorting and prioritization, while still keeping all of its original components for deeper insights.
Richer metrics for a richer world
No metrics are perfect, but the Google landscape is richer and more complex than ever, and it's important that our metrics and tactics evolve as SERPs evolve. Whether or not you use Keyword Explorer, keyword research is still a fundamental building block of a strong SEO campaign, and that research has to reflect modern SEO realities. Understanding the interplay of volume, difficulty, organic CTR, and your own intuition of importance is an important next step, and those concepts extend far beyond any single product. If you end up adapting any of these ideas, I'd love to hear about it. Extra credit for Excel spreadsheets, especially ones that crash my computer.
First, thanks to the Moz team for providing many useful tools for free; this builds more trust in them.
Well said, Peter. The game is to understand the interconnection between the four keyword metrics: volume, difficulty, opportunity, and potential.
Hi Dr. Pete - thanks for the detailed breakdown. Very happy to see keyword importance in the equation. Too many times I see people not including "important" words in their set because of the keyword difficulty score. Personally, I think a bit too much weight is put on that difficulty variable - for example, how do you tell IBM or HP that "Cloud Computing" shouldn't be in their keyword set because it's insanely difficult? The problem for them isn't a search problem but a go-to-market problem, so the difficulty score lets the wrong content, the wrong vocabulary, or links to campaign microsites continue - giving a false negative.
Question - can you replace the Forever 21 example with a non-brand example, or a multi-owner brand like Century 21? I'm not sure who, other than a review site, would deserve to rank for Forever 21. A good brand example might be Century 21, since it's both a department store on the East Coast, a real estate company, and a phrase used by individual agents. That would illustrate legitimate competitors for the phrase.
In my keyword process using my DataPrizm tool, we always start with a list of words that represent what the company does, stem them, and then go to search volume to expand them. I deal with millions of words across thousands of products, so maybe it is more of an issue for larger companies than smaller ones. It is interesting how many companies have never made that first list, and I think your "importance weighting" step helps users get back to that critical, yet often missing, step in the process.
Thanks Bill - you bring up a good point that I've been thinking about, too. It almost feels like we need a customizable Difficulty score based on the domain you're targeting the keywords with - e.g. if IBM.com is going after a keyword set, the Difficulty should be lower (or have less weight) vs. if newdomainIjustmadeup.com is going after it.
Re: Forever 21 - lots of local listing providers (like Yelp) and coupon sites (like RetailMeNot) and media sites (Jezebel, NewYorker) are targeting it :-)
Great minds.... I have just seen too many people paralyzed by KEI and difficulty metrics, including 2 employees I had to let go in the past because they hid behind difficulty measures. Most of the time the difficulty is the fault of the company and not the scoring metric. You're spot on that newdomainjustmadeup.com going after a phrase is hard for a variety of reasons, but a company recently was avoiding going after a word for a product for which they had the 5th-largest sales in the world. They were referencing your keyword difficulty score as the reason they could not rank for it - I showed them why they were not ranking and got them ranking in less than a month. That is just my point - heavily weighting that factor can lead to problems.
The insights on Forever 21 and the others that "deserve" it change the situation and make it a good example. Since I think in terms of searcher interest and typically only work with the brands, the rest of the people who "want a phrase" are often noise to me - so thanks for the enlightenment.
Hey Rand,
This is a long response I realize, but it's a real biggee for me and perhaps some of your other customers/users as well, so hope you are able to take a few minutes to review it.
Just getting to this post now, but this is exactly the way I've been going about doing keyword difficulty analysis for several years now.
After I've come up with my short-list of keywords that 'make the cut' based on all my other criteria (i.e. keyword relevance, search volume, various analytics data, etc.), I then run a keyword difficulty analysis on each keyword.
I essentially look at how many page-one organic results have an overall DA and PA (which I weight based on my own experience) that are significantly greater than, about equal to, or significantly less than the DA/PA of my client's site. (To compare the SERP PAs with my client's PA, I 'estimate' a reasonable client 'average' PA based on the existing PA of the client's homepage and inner pages.)
If by my assessment the client site beats 7 or more of the page-one sites by a significant margin (based on the typical 10 organic results on the page), then I score my client's keyword difficulty for that keyword an 'A' (meaning they have an excellent chance of ranking on page one assuming they have - or are prepared to create - quality, relevant content). If client can beat 3-6 SERP results, a 'B'. 1-2, a 'C'. None of them, an 'X' (meaning little-to-no chance of a page one ranking).
All keywords with X's then get eliminated (at least until the client site's authority has grown significantly). For all the keywords with difficulty of A, B, or C, I then weigh my scores for all factors (I do the A-B-C scoring thing for relevance/importance, volume, and difficulty) for each keyword to arrive at an overall priority for each keyword, as well as entire keyword categories based on searcher intent.
I also look at other factors when viewing the SERP results (like number of EMDs, etc.) that may influence my kd assessment, but that's essentially the process.
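If it helps make the process concrete, the grading step boils down to something like this rough sketch (simplified to PA only; the "significant margin" threshold is just illustrative, and the beat-count cutoffs are the ones I described above):

```python
# Rough sketch of my A/B/C/X grading step. The "significant margin" value
# is illustrative; the beat-count cutoffs (7+ = A, 3-6 = B, 1-2 = C, 0 = X)
# are the ones described above.
SIGNIFICANT_MARGIN = 10  # PA points; assumed, tune to taste

def grade_keyword(client_pa, serp_pas, margin=SIGNIFICANT_MARGIN):
    beaten = sum(1 for pa in serp_pas if client_pa - pa >= margin)
    if beaten >= 7:
        return "A"
    if beaten >= 3:
        return "B"
    if beaten >= 1:
        return "C"
    return "X"

# Example: a client page estimated at PA 45 against a page-one SERP.
print(grade_keyword(45, [62, 58, 40, 38, 35, 33, 30, 28, 25, 22]))  # -> "B"
```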
But here's the real point of this comment: Doing this keyword difficulty analysis is an incredibly tedious and time-consuming process, and while it's difficult for me to prove its effectiveness (because so many other SEO factors are at play... technical, other on-page factors, etc.), I'm convinced that it has made a very significant difference to my client's overall results (because the few times I strayed from this manually-intensive process of comparing Moz' DA/PA page one metrics to my clients' DA/PA metrics, the overall client SEO results were nowhere near as good).
I've searched high and low for a more automated solution that takes an approach to keyword difficulty similar to what I've outlined above - hoping that someone out there (if not Moz themselves - hint-hint!) would create such a solution using the Moz API - and have been surprised that nothing like this exists (at least not that I'm aware of), as it seems (at least to me) like a logical approach to doing keyword difficulty analysis.
In fact, I've even considered working with a developer to create such a solution, but it's just not my thing and I don't have the time. While I'm not a developer, I believe I know enough to know that it's certainly do-able, and I would imagine it wouldn't be too complex for a reasonably skilled developer to create given the right guidance about what the resulting product should look like.
My thought is that such a product might have a 'default' algorithm for weighting things like DA and PA of each page-one result vs. the client's DA/PA, but might also offer the ability to choose custom weightings for DA and PA vs. client DA/PA; the amount of deviation of SERP DAs/PAs from client DA/PA; etc... as well as possibly other factors that can reasonably be estimated on an automated basis (such as number of EMDs, etc.). For example, I might choose my weightings as 50% DA, 45% PA, and 5% EMD (simplistic I know, but you get the idea).
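As a toy example of what that custom weighting could look like (using my simplistic 50/45/5 numbers and a crude 0-or-100 EMD flag - all just placeholders):

```python
# Toy sketch of a custom-weighted strength score per result, using the
# simplistic 50% DA / 45% PA / 5% EMD weighting mentioned above. The EMD
# signal here is a crude 0-or-100 flag; everything is a placeholder.
WEIGHTS = {"da": 0.50, "pa": 0.45, "emd": 0.05}

def result_strength(da, pa, is_emd):
    emd_signal = 100 if is_emd else 0
    return WEIGHTS["da"] * da + WEIGHTS["pa"] * pa + WEIGHTS["emd"] * emd_signal

# Compare a client page against one competing page-one result.
client = result_strength(da=55, pa=42, is_emd=False)
competitor = result_strength(da=70, pa=38, is_emd=True)
print(round(client, 1), round(competitor, 1))  # -> 46.4 57.1
```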
I believe that such a custom-weighting option would offer tremendous value for more advanced SEOs (as everyone has an opinion on how things like DA, PA, and other factors should be weighted).
Hope we see something resembling this from Moz (ideally) - or someone else using the Moz API - in the not-too-distant future. It would be a huge time saver in my opinion and I believe would offer terrific value. I personally would be more than happy to pay a reasonable monthly fee for such a tool, and perhaps many other SEOs would as well.
Thanks for sticking through this lengthy response -- much appreciated! :-)
Thanks for this insight, I do something similar but instead of grades I simply list the number of "beaten" results and the number of total results (excluding features and similar results).
You don't have to be a programmer to do this for bulk keywords, there's still the option to use URLProfiler's "Simple SERP Scraper" tool, then paste those results into a Google spreadsheet that has the MOZscape API enabled. Both are easy to find, but if you need I can throw some links here
Yeah, definitely - it's important to note that the current metrics are relative to other keywords, not to the campaign/client. Keyword Difficulty tells you whether one keyword is more difficult than another, but not how difficult it is for you personally. You can use PA/DA as an approximation for that, but the real answer is, of course, a much, much more difficult problem to sort out. For now, our main goal was to help people better prioritize a large set of keywords. Looking for a non-brand example now, but site-links are usually on dominant intent queries, which are often brand-related.
I'm honestly not entirely sure what you're looking for with the Century 21 example, but here are some non-brand (or, at least, ambiguous) queries with multiple site-links and other features: "multiplication", "blender", "polar", "torrid", "anthem".
As I just noted to Rand, he gave me an insight I did not consider - if I am looking for Forever 21, I clearly want their site, their locations, any specials, and maybe their social media outlets. I can see maybe Yelp, but the rest is really noise - not to those parties, but to the searcher it is just noise. We could argue that Dun & Bradstreet could be there, as well as the BBB or others that have information about them, but that's not really the context of the query. The Century 21 example would show legitimate brand claims to the phrase beyond those that merely have content related to it. Then the 3 available listings make sense.
This makes me want to work at Moz. Seriously. And there goes my big "million dollar" idea :). Always envious of the Moz ability to bring these things to life! A few years ago I started sketching out parameters for a tool I nick named SERP Sniper. A handful of the features you have here are very close to what I thought would be extremely valuable. So from my somewhat biased perspective--well done guys and thanks!!
The SERP Analysis card specifically is gold in my opinion. If this could be combined with a suggested intent behind the SERP (overview, transactional, informational, etc.) I think there'd be a fair amount of value. I can't tell you how many times (and I'd love to see this added) I've wanted to load up a group of KW's, get feedback on how many times "x" feature appears across the SERPs for each KW as well as get aggregated scoring around how many of the SERPs appear to be more informational vs transactional vs overview/broad, etc. Lots of uses here for planning content, analyzing query intent and so forth.
Final immediate thought: I'd love the option to adjust the importance score based on on-site conversion data. I have some thoughts I'll be messing with on how I might create a model based off of some of what you've got here that allows for that type of input to be added and adjust scores appropriately.
Awesome stuff that I look forward to playing with--I honestly haven't been this intrigued by a tool in quite awhile.
Thanks! Yeah, funnel metrics are definitely something we'd like to factor in as we tie KWE more closely to Moz Pro. The trick, of course, is we've got to plug into GA or something similar at that point, but it's theoretically possible.
Makes sense. Obviously lots of ways to keep developing this. I'm a big fan of your move to work on topical research as it relates to keyword research as well. All good stuff and excited to see where you take it next.
No mathematical equation is going to be 100% correct in justifying the value of a keyword, but it's nice to have a reliably calculated value as your formula provides. I will definitely try it out, thanks!
I think phase 1 is always to find a logic that's consistent and mimics our own, manual processes. It won't be perfect, but it can help provide objectivity and speed up our work. I think the best tools and metrics aren't the ones that try to replace our own intelligence/intuition, but complement it.
Thanks for this Dr. Pete! Great to have a nice document to refer to in the future about each of the metrics we provide.
Great post, Peter. Of course keyword research is very important for the success of an SEO project, and in most cases it is not done well and we go by gut feelings.
What I really like about the article is the math formula you suggest at the end to calculate Keyword Potential - this way you put aside the gut feelings.
Thanks for the great post. Do you have an idea of when Keyword Explorer will be useful for German, Dutch, or French keywords?
Thank you for a great post - it made me take a closer look at Keyword Explorer, and it definitely helped me better understand what its metrics are for. Indeed, keyword research remains an important and not fully studied area of SEO that we all have to take into consideration. Another interesting thought is to calculate keyword potential with a formula - a fundamentally new approach that I believe we are going to use soon :) After reading the article, I launched Keyword Explorer and analyzed some keyword groups I'd like to target in a current project, and I caught a couple of new ideas which will probably be useful.
Thanks for sharing this great keyword research post. It helps me a lot with my keyword research tasks, and I've learned new things to use when doing keyword research.
It's amazing that something as basic as keyword research is still so problematic in 2016. A few years ago I did a major research project in the UK DIY industry and every single method of getting keyword volumes other than Hitwise was spectacularly wrong.
So it was while building our own internal tool that Moz launched KE. A lot of things I've been calculating tally up with Moz's data which I got quite excited about. Moz's keyword explorer might be the first affordable tool for SMBs that actually provides a real degree of accuracy. The only reason I've not subscribed yet is the volume of kws I can track on the top package would not be sufficient for what I need.
Reading many of the thinkwithGoogle case studies, it's important to not equate one search with one potential sale.
Market opportunity research in search is a minefield. EG. U.S. shoppers make 1-5 searches on average before making a purchase. Bigger purchases like cars could mean 100+ searches over a month before they eventually buy the car. And that's before we get to the difference between clicks, sessions, what your conversion rate is based on, CTR per position per device, and how dramatically seasonality changes the volumes. The Q4 volume spikes are insane in the U.S. compared to the rest of the year.
Longtail data is possibly the hardest to evaluate as many services from SEMrush to SearchMetrics don't provide what you would hope for or expect, despite many of their excellent benefits.
Why does Google continue to make it so hard to accurately measure and carry out what we do? How much organic traffic from iOS devices is being recorded in analytics as direct? Not everyone has enterprise level budgets so if Moz's new tool helps level the playing field, it's good news for everyone.
Great post on finding good SEO traffic keywords and doing research for your business site. Thanks @Dr. Peter J. Meyers
It must be continuously recycled. To me this is what makes our profession attractive. :)
Good morning Peter,
That post was reeeeally interesting. I didn't know there was a math formula to find the best keyword! Personally, I work with small businesses, so I may choose a keyword that doesn't have a lot of searches every month but that I know will convert - if it gets 50 searches, maybe a high percentage of them will be conversions. Those low numbers usually come with low CPC and low competition, which is good.
I think it's also a good approach for small companies that would be competing with big brands. If you're selling t-shirts online, you need to find a more specific keyword that allows you to position better, like "cute online t-shirts" or "original handmade online t-shirts." I think long-tail keywords are great for those companies.
Congrats for the post :D
A pretty good approach for keywords.
Thanks for a great article Peter!
Thoroughly enjoyed reading it.
I think this is a solid way of researching keywords, but what about all those keywords (semantically close to your researched keyword) that are hidden from you in Google Cluster data? How would you address that? And thanks again for this amazing article!
Competition on the Internet has exploded in the past few years - it seems as though the number of results has grown exponentially but the number of searches barely inches up. The folks at Moz have often cited the necessity of sorting keywords into groups; if you could make the Keyword Explorer tool smart enough to do some of that sorting, how awesome would that be?
The way I see it, we are moving from the era of keyword search to keyword guessing. People use longer and more precise keywords to search for results. You can get some ideas from historical data and build your keyword variations to target based on it.
A very detailed investigation for 2016, along with keywords that focus on reliable and thorough discovery.
I read this post carefully and I liked it. I am passionate about SEO, and I will try its mathematical formula soon.
So, what are the differences between "Keyword Difficulty" and the "Difficulty" score in Keyword Explorer?
Hi Peter.
KW research is very tiring and consumes a lot of time.
Of course, I do not mind doing it, as it is part of the job.
But the worst part is, after taking so much trouble to find the right KWs, one does not get traffic.
That sucks.
Regards.
Veena
Brilliant post, Dr. Pete! Definitely not your "old-fashioned" keyword research - your insights are always welcome! :)
Thanks Bill for this very informative article regarding the best keyword-finding tricks. I always read your articles and follow all the activities mentioned in your posts. Thanks again.
Great Post, Peter !!
True. SEO must evolve as much as SERPs evolve. We will soon reach the days when organic results take up as little of the SERP as possible, so the choice of the main keyword will have to be more comprehensive.
Peter, this is great. I found this really helpful and I am definitely going to try your methods out.
So are you basically saying that Keyword Difficulty (V1 or V2) doesn't use DA at all in its calculation? And why is that?
We use PA where available, and DA in some edge cases, since PA tends to correlate better with the ranking of a particular page (as it's designed to do).
"Our metrics and tactics evolve as SERPs evolve." Well said, Peter! That probably sums up your thoughts in this post. IMO, Keyword Research can become an overwhelming or simplified process based on how an online market researcher looks at her business and where she intends to deliver unique value to her customers. For many, the keyword search is overwhelming because they are confused as who they want to cater to and they want to compete with. As you have mentioned comically, after doing all the hard work to research your target keywords, you might as well realize you've been wrong with your approach. The point is you didn't start off on a local note to being with.
Hi Dr. Pete,
Thanks for deconstructing this! For me at least, there was some ambiguity around keyword potential following Rand's most recent WBF - this really helps put the new tool into perspective. Our team uses the Keyword Effectiveness Index (KEI) to measure opportunity/potential. Granted, we take the KEI calculation with a grain of salt; I'm wondering how much value or benefit the community places in KEI when it comes to choosing keywords for modern SEO?
Thanks Again!
Hi Paul - can you explain or link to what you mean when you say "KEI"?
Hey Rand,
KEI is a simple calculation that divides the number of results (pages indexed) by the monthly search volume.
For example - if "Coffee Beans Online" has 42,900 result pages indexed and a monthly search volume of 3,170 the KEI would be 13.5. Meaning for each searcher there are 13.5 pages competing for that user. It's a bit crude, you can refine it a bit by using search parameters, Just wondering if it's still a viable model to gauge effectiveness.
Potential is tricky. Even with our fairly narrow definition, you have to make a lot of assumptions about CTR. For any given, single SERP, I think a human is going to have much better intuition than a generic model. However, we're pretty bad at using that intuition across 100s or 1000s of keywords, not just because of the time it takes, but because that intuition is hard to quantify. That's where I think the model shines - relative comparison across medium to large keyword sets to help us quantify something like our own intuition.
We're definitely hoping to refine the model based on real CTR data and feature interactions, though, over time. We also plan to continually review the data the models produce and make sure it reflects our own combined SEO experience at Moz.
Hi Peter, great post and insight about how the Moz keyword tool works and the metrics behind it. Quick question: if a keyword has low volume but possibly high conversion, how can we identify those types of keywords using any tool?
Thank you
Thanks for sharing.
It has been an interesting ride in the SERPs data changes over the years. When map listings came out in 2008, it changed the game and forced everyone to reevaluate local marketing. From A-G and now 3, who knows what is next when it comes to local.
I think many companies are still blasting out hundreds of keywords in a single market and then sitting and waiting. We all know this is not the best strategy, but these are small businesses trying to do it on their own, and they end up lost over time.
The KWE tool really is helping and we have sent a newsletter out to all of our clients pushing it. Very cool guys. We like it.
I love the keyword explorer tool! Thanks for breaking down the details of the tool here.
Has anyone made a spreadsheet for the math formulas in the article? I'd love to have it. Trying to do it myself right now, but I've never done it before.
Writing something centered on a keyword alone is like restricting yourself to a corner, with no option or scope to move freely, stretch your hands and mind, etc. However, digital writing in the present scenario has started looking beyond the boundaries of purely keyword-driven writing. The most important thing for a written piece is that it be interesting and exciting enough for a reader to actually read it. When a reader decides to read a particular write-up - drawn in by its heading or pictures, etc. - that in itself proves it is serving some purpose for them; whether it is informative or promotional content is a separate matter altogether.
Thanks, I think this is the key sentence: "You can use a meta-metric for sorting and prioritization, while still keeping all of its original components for deeper insights."
Having a quick way to compare value between similar groups of keywords spares you time, but it's important to remember where the logic came from and to always be able to refer to the original data used in the formula.
Sometimes just looking at two more simple metrics and using gut instinct (in terms of not calculating them into a single number) can draw a much clearer picture of keyword potential.
Any isolated metric will sooner or later start misleading you and make you lose sight of the big picture - even (or especially?) if it's your own calculated metric.
The more I dig into metrics/analytics and we invent our own, I think the answer to whether to use deep or broad metrics is: "both". There are times you need to slice-and-dice and there are times you need to dive in. I think the best tools in the future will allow you to take both views. Drilling down to one metric can be useful for comparison, but then you also need to be able to see the pieces. We're seeing this in content and social metrics, too.
Absolutely, great point.
Fantastic post - not only showing the benefits of these metrics and how they have been integrated into the new Moz tool, but also how, as SEOs, we can better let the client know exactly what needs to happen and why we need to do it when targeting key terms. I think it also helps the person doing the work understand the amount of energy it is going to take to get the job done, so for any non-agency SEOs, setting realistic client expectations is easier when using these metrics.
Great post Peter!
Thanks Dr. Peter J. Meyers,
These tips are useful and will help me look for keywords that are really easy to rank for. I would love to build a keyword list to try this strategy.
My strategy to find keywords was somewhat different: I would simply search for a keyword and look for websites with low domain metrics, extract keywords from them with the help of SEMrush, and target the keywords for which those sites ranked on top.
Thanks and regards
Pulkit Thakur
Thanx for sharing!!!
Thanks for sharing a really wonderful post.
Amazing post. Very good technique - our marketing team will implement it in our next keyword research. Thanks for sharing.
Informative post for small business owners, Thanks
Thanks for the great post!!!!
Thanks for sharing.
I started using this in 2016 and the software is very simple and easy. The website that we use is tiketturindo.com
And you just gave the duplicate master key. Great write-up Doc.
Thank you. Very useful, but how do you judge the value of a keyword?
Nice Post!