As a consultant, I work with many in-house SEO teams on strategy and other issues that arise throughout the year. One trend we are seeing is that these in-house teams are having a hard time coming up with accurate traffic-centered goals. Traffic is the basis for many other metrics, so being able to semi-accurately predict that number for the coming year is important for every business.
I can hear you all now, "Well there is the Google Keyword Tool ... use that." Typically, that is my answer too, but there have been major questions about the accuracy of Google's keyword tools and others available to webmasters, marketers, and search engine optimization teams.
(If you comment with your favorite keyword tool other than those I mention, I'll happily test it and add it here!)
The Google Keyword Tools (yes, plural)
There was a shift recently with the Google Keyword Tool. The Legacy/API version is showing different numbers than the newest Beta interface. David Whitehouse and Richard Baxter both noticed this shift as well and ran a few tests on accuracy. The jury is still out as to which version is more accurate, the legacy or the new keyword tool. Like Mr. Whitehouse, I believe the newer tool is the one being updated going forward, but that does not necessarily make it more accurate.
To be clear, when I speak of the Legacy, API, and Beta tools, I do mean different versions of the Google Keyword Tool. First, from what I can see using the SEOmoz Keyword Difficulty tool, the Google API pulls from the Legacy tool, so they are one and the same. The Legacy tool is the prior interface for the current Beta version of the Keyword Tool. We had previously assumed that these pulled the same numbers, but my research and that of others proves otherwise.
But wait! *infomercial voice* There is more!
There is also the Search-based Keyword Tool, which aids AdWords advertisers in choosing relevant keywords based on search behavior and a specified website. This tool is explained by Google here and gives more in-depth information on account organization and cost.
But even this tool is not on par with the other two when it comes to impressions. A random query in the Search-based tool returned a suggestion for the keyword "maragogi." The Search-based tool says there should be 12,000 monthly searches. The Legacy tool returns 110 Local Exact match searches, 33,100 Global Exact match, and 201,000 Global Broad match. The new tool returns information only for a global setting (all countries, all languages): 74,000 searches for broad and phrase match, and 12,100 for exact match. It seems the Search-based tool tracks the Global Exact match number in this one instance. But what is a business supposed to do with all of these numbers?!?!?
(hint: always use exact match)
Back to Strategy
If these tools are possibly inaccurate, how do our clients go about setting their yearly strategy goals?
Put simply, in search, you never want to rely on one set of results or one ranking report. Data over time and from many sources is best. But with the limitations of the tools out there and Google driving at least 65% of organic traffic for most sites, how do you get the best numbers?
Impressions
First, figure out how many impressions a keyword or set of keywords can bring in on average for a specific month. If you are in a cyclical industry, this will have to be done for each month of the calendar year.
1. Pull from both Google Tools and other Keyword Tools
The idea here is that if you take into account all of the numbers out there, you might see a trend that you can use for estimating future traffic. If there is no trend, then the median of the numbers can be used as your metric. A few other tools you might look into include WordTracker and KeywordSpy. You can see that the numbers are all over the place, but looking at these figures, I'd guess that the keyword might bring in around 6,500 impressions a month in the UK.
The downside is that WordTracker and KeywordSpy don't allow you to look at exact match information versus broad match. When performing keyword research, you always want to look at the local (targeted to your country) exact match information. Too many people pull keyword information using broad match and end up with inflated numbers covering all phrases related to that key phrase.
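As a rough illustration of the "no trend, take the median" approach, here is a minimal sketch; the tool names and volumes in it are hypothetical, not real data from any of these tools:

```python
from statistics import median

# Hypothetical monthly search estimates for one keyword, local exact match
# where the tool supports it (made-up numbers for illustration only)
estimates = {
    "google_beta_tool": 5400,
    "google_legacy_tool": 8100,
    "wordtracker": 6600,
    "keywordspy": 7200,
}

# No clear trend across sources, so take the median as the working figure
monthly_impressions = median(estimates.values())
print(f"Estimated monthly impressions: {monthly_impressions:.0f}")
```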
2. Run a PPC campaign if possible.
The absolute best way to get accurate numbers about traffic over time is to run a PPC campaign. I pulled some numbers from a few campaigns (for our clients' sake we have masked a number of the actual key phrases) in an attempt to see whether the new keyword tool was accurate against actual traffic in August. The keywords pulled were all exact match in the campaign, and the information pulled from the keyword tool was Local Exact, set to the country that the campaign was targeting.
Some of these are higher and some lower. What I found is that there really is no definitive answer as to whether the Google Keyword Tool is accurate. Take a look at the results for the example I used before, curtain fabric. The campaign saw 11,389 impressions, much higher than the new keyword tool and lower than some other keyword tools. This is why a well-run PPC campaign is important if you want a more accurate look at impression numbers.
Please note that I didn't get a chance to ensure that these accounts were showing at all times during the month, but they were all accurately geo-targeted and all showed at the top of the first page on average.
Finding Traffic Based on Rank
After getting a good idea of the number of impressions, you then need to take into account where you show for that keyword on average organically (aka your rank). While we cannot know specific click-through numbers for every search done on the web, there have been studies on how much of those impressions the top organic result gets, the second, and so on. The one I use most often is from Chitika. Using the percentage of the traffic and the impression numbers, you should be able to get a good idea of the visitors you can expect per month organically for a specific key phrase.
So using the "curtain fabric" example, assuming that the site I am working on has maintained an average ranking over the last few months of #3 organically, I could expect about 1300 visits from Google for the keyword in a month (11.42% of 11,389 impressions).
Past Metrics
Once you have everything figured out, keep in mind that your past metrics are another good way of checking how close your traffic estimates are. Assuming that no major changes have occurred (like a gap in your metrics data over the last year), a look back is the most accurate way to understand traffic flow and trending on your site. Pull the unique visitors for every month of the last year and do some analysis on the percent increase month over month. This can be done at any level in most analytics programs - from overall traffic trends all the way down to the keyword level.
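For anyone who wants to script that look-back, here is a minimal sketch; the visitor counts are made up purely for illustration:

```python
# Hypothetical unique-visitor counts for the last twelve months, from analytics
monthly_uniques = [12400, 13100, 12800, 14200, 15050, 14900,
                   16100, 17300, 16800, 18200, 19500, 20100]

# Month-over-month percent change, the basis for projecting next year's trend
mom_growth = [
    (curr - prev) / prev * 100
    for prev, curr in zip(monthly_uniques, monthly_uniques[1:])
]
for month, pct in enumerate(mom_growth, start=2):
    print(f"Month {month}: {pct:+.1f}% vs. previous month")
```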
Educated Guesses
In the end, though, predictions are just that: educated guesses. Pulling data from all available sources and using your own historical data can help you make an educated prediction for the next year. Keep in mind, though, that things never stay the same. Google Instant just proved that with one of the biggest changes we have seen in a while.
I hate to be correct sometimes, and unfortunately this is one of those times. Kate, your traffic estimation strategy has some big flaws. First, the Legacy tool data is highly inaccurate (up to 90%, or sometimes even more). Google is notorious for showing inflated numbers; ask anyone, especially in the PPC department. So you need to apply certain corrections. Second, the focus should be on estimating clicks to your site and not the number of searches. Third, there is no correlation between clicks from a PPC campaign and those from organic search results.
There are numerous eye-tracking reports and other research to prove that. Moreover, everyone knows that people click more on organic results than on PPC results. Fourth, your traffic estimations are based on Chitika's research. As far as I know, Chitika's research has not taken into account blended search results, local business listings, type of search query, different industries, and branded search listings (like Amazon), which dramatically skew the results, so their conclusions are flawed. You can verify this through your Google Webmaster Tools 'search queries' report. Please enlighten me if I am wrong. Explaining traffic estimation any further is beyond the scope of this comment, but I would like to add one thing: don't build search campaigns which heavily depend upon traffic from the few so-called high-volume keywords. Concentrate on long-tail keywords and use your internal site search reports to target new keywords.
Hey Himanshu. You mentioned "Third there is no correlation between clicks from a PPC campaign and those from organic search results." Can you elaborate on that? I've always had the opinion that a PPC campaign was one of the most accurate tools to determine keyword usage for sites (excepting of course the keywords discovered from your log files/analytics reports)
If users click less than 20% of the time on PPC and over 80% of the time on organic, then in my reasoning 1000 impressions in the PPC campaign would translate to a minimum of 5000 organic clicks.
I don't know how marketers got the notion that a PPC campaign is accurate for determining traffic estimates for organic search. Yes, PPC is one of the best tools for testing transactional keywords and determining converting keywords. But how much traffic comes through transactional keywords? The majority of traffic comes through informational queries, which a PPC campaign can't test with any accuracy. According to recent eye-tracking studies, the probability that users click on a paid search result increases for transactional queries, and on an organic search result for informational queries. Within informational queries, users may go down to the 10th result on the SERP. This is just one example. There is no hard and fast rule that users will click more on organic results than on the paid ones every time. It all depends upon the search intent, the industry, and the types of results on the SERP (images, local listings, news results, etc.). I will buy your theory if you can show data that prove a correlation between clicks from a PPC campaign and those from organic search results.
I use an initial PPC campaign not to judge numbers; you are 100% right that PPC and natural search results are different. I use it to gauge market intention. A decent click-through or conversion rate helps me decide if it is a market or niche worth looking into more deeply.
AOL released search stats a few years back by mistake, which gave the average click-through rates for positions 1 to 3. I'd suggest comparing that data to known #1 rankings, and comparing the analytics data to the suggested number of searches. Then take that data and break it down across multiple results.
That way you could get a good idea regarding the accuracy of the results given for the markets you are targeting.
Hi Himanshu!
First off I am all about being wrong. Being wrong means I am still growing so I welcome any edits to my posts. It's how I learn and I know the Moz community and the larger search community is much smarter than I. :)
First the Legacy tool data is highly inaccurate (up to 90% or sometimes even more).
Yes, and if I did not make this clear, I agree. The Legacy tool seems to be, in most cases I've seen, highly inaccurate. Which is why I wrote this post.
Second the focus should be on estimating clicks to your site and not number of searches.
If you read all the way through, I do focus on clicks to the site. But to determine that, you do need to identify the number of possible searches as the base metric.
Third there is no correlation between clicks from a PPC campaign and those from organic search results.
I don't believe I ever used numbers from a PPC campaign when talking about clicks or CTR. I merely looked at it to gather a number of impressions, or searches, for a particular keyword.
Fourth your traffic estimations are based on chitika research.
Yes, I chose that because it was the most recent research, but I am happy to point to anyone else's recent research that definitely includes blended results and the like. If you have a better, more recent research set, I'd love to see it. I've been looking around for more.
You can verify this through your Google webmaster tools 'search queries' report.
Even these numbers are not accurate as my co-worker Tom pointed out.
Don't build search campaigns which heavily depend upon traffic from the few so-called high-volume keywords. Concentrate on long-tail keywords and use your internal site search reports to target new keywords.
I agree with you that focusing on just a few keywords is not the way to go; there are so many out there. But not everyone utilizes internal site search, and there are so many options available for key phrases that one does need external assistance guesstimating the traffic for any given keyword. In the end, every search marketer needs to use all the tools available and use logic to determine which keywords are best and what the corresponding potential traffic in the next year might be.
Thanks so much for your comment! I am always open to any new data and tools in regards to pulling keyword traffic. If you have other ones, I'd love it if you would share. I'll happily attribute via link who mentioned any new process or tool.
Kate
Good post for sites with a long head (a high proportion of traffic from a few generic or branded keywords), but it would be interesting to see a post on how to estimate traffic for large ecommerce sites with a very long tail.
I am totally with you. With the long tail and with many keywords this is a long arduous process.
What would you think about using a percentage based on historic proportions in these cases? Figure out how much of your traffic the head terms and the long tail each accounted for in years past, and then apply that percentage (or an extrapolated trend) to the current data on your head keywords?
Another idea is to separate keywords into groups like brand vs. non-brand, research vs. direct response, head vs. long tail, products vs. categories, etc., so as to make them more manageable. Then you can use percentages (as Kate suggests) to create predictions.
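To make the percentage idea concrete, here is a minimal sketch; it assumes group shares stay roughly stable year over year, and every number and group name in it is hypothetical:

```python
# Hypothetical split of last year's organic traffic by keyword group
historical_share = {"brand": 0.30, "head_non_brand": 0.45, "long_tail": 0.25}

# Forecast only the groups you can estimate directly (e.g. head terms via the
# impressions x CTR method from the post)
head_forecast = {"brand": 9_000, "head_non_brand": 13_500}

# Scale the long tail off the head forecast using its historical share
known_share = historical_share["brand"] + historical_share["head_non_brand"]
total_forecast = sum(head_forecast.values()) / known_share
long_tail_forecast = total_forecast * historical_share["long_tail"]
print(round(long_tail_forecast))  # ~7,500 long-tail visits
```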
Dear Kate,
Thumbs up for putting on the table one of the hardest problems I have to solve as a consultant. It's a classic question I have to answer from the very beginning of an engagement, and especially something I have to deal with when someone asks to use the estimate as a guarantee.
As you say, I usually use the Keyword Tool by Google... and I too have noticed big discrepancies between its different versions.
Personally, as you suggest, whenever it is possible I also use PPC as a traffic discovery tool. In those cases, apart from using only exact match keywords, I also take care to create that "special" campaign so it is delivered only on SERPs, and only Google's SERPs. I think that is worth noting, because by default Google publishes the ads in every channel (Google SERPs, search partners' SERPs, and the Display Network as text ads).
Two questions:
What makes you prefer the Chitika impression/CTR tables over the classic AOL ones?
I actually think there isn't a great way to tell, with so many things going on in each search result nowadays, but this study was more recent and I think it takes the new interface more into account. I'd be fascinated to see a new one in the coming months with Google Instant.
And do you think Google Instant will at least have the effect of changing the percentages to the advantage of above-the-fold results?
I am not exactly sure what the impact of Instant will be. It's another way to push organic results even further below the fold, especially for certain popular terms that show all integrated results. Personally, I think Instant will change the numbers overall, and they will drop, in some instances dramatically. But only time will tell, and it'll be harder and harder to tell what results people are seeing. We can turn off Instant, but I am assuming few searchers actually will.
So I am thinking above-the-fold results will keep their advantage over other organic results, but I am not sure the percentages will shift beyond what they were pre-Instant.
And thanks. This is such a hard topic to speak to since there is no definitive answer. I am really interested to see what everyone has to say. With things changing so much in Search and with the tools available to us, this conversation really does need to happen.
Thanks for the answers Kate...
Ah, before I forget, let me tell you why a PPC campaign is usually the best way to investigate traffic volume:
Let's say you have a website that focuses on a geo-targeted market. It is not rare to see the Google keyword tools show you a nice "---" for geo-targeted keywords, which means they don't have a consistent amount of data about them (the same happens, on a larger scale, with Google Trends and equivalent tools). In those cases, either you give a tarot reader a try or you collect that data from a PPC campaign.
"In those cases, either you give a tarot reader a try" LOL Gianluca. You are forgetting another excellent source of inspiration: the Magic 8 Ball.
For real. I hate it when there is the -- answer. I got that on one of my tests this time. But I kinda get why there is no answer? Kinda. But in a perfect world I'd like to see the real search numbers. Oh well. :) But you are so right, this is where PPC can really help SEO.
And what about data from Google Webmaster Tools? There you can find some great input on impressions and CTR. We did some tests comparing the CTR there with Analytics, and the CTR was very accurate against actual visitors.
I was going to bring up the same thing. It's interesting to see the difference in numbers between Webmaster Tools and the Google keyword tool. The biggest keyword we rank for on the first SERP shows 46k individual searches a month, but in Webmaster Tools it shows impressions much lower than that.
It's great to compare, but it still makes you a bit wary of placing too much confidence in those metrics...
Hey Kate
Totally with you on running a PPC campaign - we do a fair bit of market research for startups, who don't know much at all about user search behaviour in their targeted niche. The PPC impressions really help us firm up how accurate (or not, as the case may be) the GKWT data is.
There was often a strong difference between the API output and the external KWT, chiefly because the API volumes are annual averages while the external GKWT is volume data from the previous month. If that situation has changed, then it has done so very recently.
I think that the real trick with managing KW research data is to understand the limitations introduced by each source. There is no one correct answer, sadly.
BTW - it's nice to be able to comment again. Great skills to Adam and team for getting the new site up and live!
BTW - it's nice to be able to comment again. Great skills to Adam and team for getting the new site up and live!
Amen to that Richard. They also fixed another posting issue. For years I've used the non-WYSIWYG editor because there was a really funky bug in the WYSIWYG one that would throw paragraphs below your cursor when you hit return for paragraph spacing. It was easier to just use HTML when composing posts.
Now, the non-WYSIWYG editor doesn't let you make paragraphs at all. Or rather, it does let you make them; it just doesn't retain them when you publish your comment.
Don't mean to hijack Kate's comments but I'm mentioning it because it sounds like you and I had identical issues in not being able to post, and now it looks like you might have tried to use paragraphs in your comment but it all got truncated (completely ignore me and dismiss me as a buffoon if indeed you only created a single paragraph intentionally.)
Hijack away sir, Hijack away :)
Baxter!!! *hug*
Thanks for commenting. I was so happy to see your post. Yours was the first that gave me a sigh of relief that I wasn't going crazy. I am hoping someday we get a better tool to estimate search traffic. SEOgadget wanna build that? *wink wink* (please! :P)
Richard... well... I won't "wink wink", but I would surely offer you a good bottle of Ribera del Duero next month in London if you commit to studying the possibility of a tool like that :)
Of course proper due diligence would dictate that I go and look for what I'm about to ask ... but I'm lazy. So, after that extended preamble, here's my question.
Isn't there a significant difference between the Google KW tools, where one of them incorporates low-intent search volume from the content network?
I'm pretty shocked only 1 person mentioned Google WMT.
We ran into this problem several months ago and used the keyword tool in conjunction with the older AOL data (back when we weren't ranking for our target terms) to produce a very rough guesstimate of traffic for the next 12 months for our major terms.
Wow, were we wrong: the Google AdWords keyword tool had inflated every single search term by about 1,000%.
Since then we've hit the top 5 for the majority of our terms and have been there a while now. We can go into WMT, see almost exactly how many impressions our SERPs are getting, and run much more accurate guesstimates for the next 12 months. The Webmaster Tools impressions are almost 100% accurate (we can compare using our analytics). This is the only way we can accurately estimate traffic for a position: if we hold a front-page SERP for a good couple of months and get an almost exact count of impressions, then from there we use the Chitika chart and we're golden.
On a side note, we run a six-figure PPC campaign in conjunction with our organic efforts, and we used PPC to get an accurate impression count (using exact match, search only, highest bidders by far). Even that data shows the Google AdWords keyword tool to be way, way, way off the mark, not just for a few, but for absolutely all of our main search terms (about 100 of them).
I now stay far away from the AdWords keyword tool. I'm not sure what useful metrics can be gained from it, unless my industry/niche is a one-off (though when using it for freelance work, it's pretty much the same story).
Wow. Thanks for that comment! The research I have seen hasn't been as positive about WMT's accuracy, but maybe I need to revisit it. I'll take a look at those numbers this morning and see what I find. When those numbers were first released, though, they were so far off what we were seeing overall.
You got me thinking now ... if you have any examples you can share via PM, I'd love to see them.
And yes, it really is amazing how far off estimates are in keyword tools across the board.
And I had the same issue pulling info from AdWords. Finding good traffic exact match keywords with available data was hard! Too many campaigns are so well targeted to specific states and regions that I couldn't use those either. (AdWords, Google, can we please please have info on traffic locally? Just sayin')
pm sent :)
Hey Kate. I'll keep this short as I seem to not be able to stop writing these epic length comments today. I really enjoyed reading your take on estimating traffic. Like Gianluca said above, it's one of the harder aspects of keyword research. While it's not an exact science, I appreciate you trying to tackle it head on. Thumbs up.
I prefer using the Google AdSense keyword tool for optimizing, among other factors.
Can you provide a URL to it? I can't seem to locate an AdSense Keyword Tool, but would love to test it out!
Hey Kate,
You might want to revisit that last image you posted. You blurred out the keyword in one place only; however, there is a second place where it is listed. Might wanna go back and change it.
I did. Thanks ;)
"The absolute best way to get accurate numbers about traffic over time is to run a PPC campaign."
I fully agree however when taking the numbers from your PPC campaign there are some points to look out for:
1. Don't include content network data
2. Because Google's keyword tools are meant mainly for PPC purposes, I would think that the numbers shown in the tool include data from search engines partnered with Google, such as AOL. Also, data from your PPC campaign will usually include searches from AOL and other search engines unless you disable them from your campaign settings page in Google AdWords (which most people don't do).
So, if you want data purely for Google, the best way would be to run a PPC campaign that is not opted in to the content network or Google's search partners.
3. Your PPC campaign ads don't necessarily show 100% of the time for the given search term.
________________________________________________________________________________
Another point I wanted to make is about the percentages taken from Chitika. The data is probably reliable for a lot of search terms; however, there are specific types of keywords that would probably not fit the percentages Chitika gave.
Take "fox news" as a keyword for example. I would think that the percentage for the 1st position which is foxnews.com would be higher than 34.35% because it's a brand name.
Also - topics which people tend to research a lot such as "cancer treatment" would probably mean higher percentages for lower ranking sites.
I'm sure you guys can think of other groups/types of keywords that would yield different percentages.
Nice overall post, but I have to agree with some of the comments (e.g. seo-himanshu's); to me it's probably just a bit generic.
For example the "past metrics" you are talking about view of month on month, but not year on year which if available is much more accurate traffic predictor, due to seasonality in majority of the markets.
Thanks
Great point! I didn't mention year to year I think because I hear too often about companies not having more than a year or two of metrics since they change analytics packages so often. But you are right in that the trend over time (years) along with month over month, is integral to seeing traffic patterns.
This is a generic post, but as I dug into the data from clients, I realized that crafting the perfect post would take a few months of building a test case. Which I might still have to do seeing as there is a good amount of interest here.
Good topic to look at. I use the AdWords tool, and between different versions there can be discrepancies, which can be confusing when discussing things with, say, a designer who has already written the numbers out. The tool is a great method of getting the required information, but even on "exact" searches you should take the numbers as a guide only.
Good stuff Kate :)
One of the other ways that I try to predict volume is by using Insights (https://www.google.com/insights/search/#), especially if one of my keywords is in the top position for a high-traffic term. I would then plot other keywords against the one for which I know the value of a #1 position, and use the weighted figures that Google offers against that volume to predict the potential for other keywords.
The beauty is that the keywords need not be of the same subset to give you a fair indication. The only limitation is that this strategy only works for high or fair volume keywords, but it makes a very decent argument for higher SEO budget requests.
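Here is a minimal sketch of that weighting approach; the Insights index values, keyword names, and visit count are all hypothetical:

```python
# Hypothetical Google Insights (relative) index values, plus one known data
# point: a keyword we hold the #1 spot for, whose real monthly organic visits
# we can read from analytics
insights_index = {"known_keyword": 68, "candidate_keyword": 42}
known_keyword_monthly_visits = 5_200  # from analytics, position #1

# Scale the candidate's potential off the known keyword using the index ratio
scaling = insights_index["candidate_keyword"] / insights_index["known_keyword"]
candidate_potential = known_keyword_monthly_visits * scaling
print(round(candidate_potential))  # rough #1-position potential for the candidate
```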
I also use PPC Campaigns to validate the numbers of the KWTool, but there are a lot of factors that weren't mentioned here.
First of all, the purpose of the PPC campaign should not only be to get the "exact" number of impressions, but also to provide insights about the quality of the landing page and the relevance of the specific keywords (e.g. through conversion rate and bounce rate). Even if ad clickers tend to be more buy-oriented than "organic clickers," this method is an excellent way to check the performance of preselected keywords.
Furthermore, it is clear that the number of impressions can't be 100% accurate. There are several reasons for that. First of all, even if you have the highest bid, that doesn't necessarily mean your ad is always in the first position, since the displayed ads also depend on the searcher's Google profile and click behavior. In addition, we have seasonal effects for most keywords, so a PPC campaign in March doesn't give you information about the search volume of a keyword in August. Certainly, Google Insights for Search is a good way to adjust these numbers over the year.
In addition, I wonder why nobody mentions that the number of impressions in a PPC campaign can't be translated 1:1 into the number of clicks on organic results. The question is how many people click on ANY result instead of starting a new search (which would be one impression but no click), and how many people click on a PPC result instead of an organic result (which would be another impression without an organic click). If I remember right, the data provided by AOL some years ago pointed out that nearly 50% of all searches result in a new search instead of a click. Therefore the number of impressions must be reduced by the predicted proportion of PPC clicks and of searches that result in a new search (I guess it's a lot less than 50% now).
Anyway, I think PPC Campaigns are a great way to get information about the search volume, but you need to know how to handle them.
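A minimal sketch of the adjustment described in the comment above; the discount rates here are pure placeholders, not measured figures:

```python
def organic_click_potential(impressions: int,
                            no_click_rate: float,
                            ppc_click_rate: float) -> float:
    """Discount raw impressions by searches that end with no click at all and
    by clicks captured by paid results (both rates are hypothetical)."""
    return impressions * (1 - no_click_rate - ppc_click_rate)

# Example: 11,389 impressions, guessing 30% of searches end without any click
# and 10% end on a paid ad
print(round(organic_click_potential(11_389, no_click_rate=0.30, ppc_click_rate=0.10)))
```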
Hi
Very nice article... the only issue I have with this is that PPC customers' and organic customers' behaviors seem to differ at times, especially by industry.
For example, one of my clients is a payday lender. They rank well for the term "paycheck advance" and also have a PPC campaign. On organic they get far more clicks, but PPC customers seem to convert better...
anyway it's a thought I throw out..
LOVE THIS TOPIC. It's a bit hard to understand why Google's keyword tools don't really do the job. You can easily end up buying a domain based on faulty numbers that won't get any traffic at all. I wish Google would make their tools easier to use and understand for someone interested in e-commerce. SEOs should not be writing exposés of their keyword tools year after year after year...
Actually, maybe this is a good opportunity to ask a question about the Keyword Difficulty Tool I've been wanting to ask for a week or two...are the changes in competitiveness just the result of Google changing their tools? The tool rates my market defining keywords as a lot more competitive than it did a few months ago!
Hi there - good question.
The KW Difficulty tool on SEOmoz doesn't use the number of searches or search volume as an input for the score (as we find these often have only a small relationship to the actual challenge of ranking and, as has been pointed out above, frequently show some questionable numbers). Instead, it leverages the PA/DA scores for the top-ranking web pages/sites in a weighted average to help show the link popularity and quality those sites/pages have achieved. We've found this to be generally analogous to how difficult it is to push those top-ranking results aside in favor of your own work.
The shifts you see happen every Linkscape index update, when we re-calculate our metrics and determine the appropriate scores for each page/site in our index. Due to this shift, KW Difficulty scores are best to compare on a regular basis, as the results ranking may change, and the links they've acquired may as well.
Ah ha! Thanks for answering my question Rand I appreciate it. Interesting to know how the score is calculated...I like it better than Google's extremely exaggerated (and to me meaningless) competition scores.
Hey Kate:
Here's some additional insight on the often huge search volume difference between Google's Search Based Keyword Tool (SKTool) and Google's Adwords Keyword Tool (GAKT).
In our b2b market, the difference in search volume between these tools can be extreme. For example, the search volume for "environmental equipment" is 12,100 searches a month in GAKT, and only 155 searches a month in SKTool. Needless to say, that's a huge difference, and a bigger difference than can be explained by "exact" versus "broad" matching.
We've found that SKTool is much more representative of actual "organic search" volume, whereas GAKT appears to include "searches" that are not at all organic, and may even include click-through rates from Adwords campaigns and other paid listings.
Check out our post "Google's 'search volumes' estimates are not what you think". There are also some good examples in the comments. I know Google hasn't been very clear about the difference between these tools, and that's unfortunate. But, understanding the difference can add a whole new dimension of insight, and much better estimates of SEO opportunities for clients.
Thanks for your post, I found it really useful. It's always difficult to predict which position we will be in in a few months, and even more difficult to figure out how many visitors we'll gain. Good job.
Dear Kate,
I thoroughly enjoyed this focus on objective objectives if you will! I think it is very important to implement numerical goals that are both challenging and attainable and that are also a manner in which your employer or agency can measure your performance in an objective manner. I have been striving for this sort of goal-setting agenda where I work and it is encouraging to note that this is a methodology that is gaining traction elsewhere.
Good luck and thanks,
Matt