When we started reporting the Google “weather” on MozCast, we knew that one number could never paint the entire picture of something as complex as the Google algorithm. Over the last few months, we’ve been exploring other ways to look at ranking data from high altitude, and have reported on metrics like domain diversity and EMD influence. Today, I’m happy to announce that we’re rolling out five of these “top-view” metrics on MozCast, updated daily.

From the new “METRICS” page (top menu), you’ll see five tabs:

Domain Diversity, SERP Count, EMD Influence, PMD Influence, Daily Big 10

Each metric defaults to a 30-day view, but you can also see 60-day and 90-day data. Note that the Y-axes all auto-scale to emphasize daily changes, so check the scale when interpreting this data. I trust you all to be grown-ups and draw your own conclusions.

So, let’s dive right into the five top-view metrics…

(1) Domain Diversity

The domain diversity graph shows the percentage of URLs across the MozCast data set that have unique subdomains. Put more simply, it’s the number of unique subdomains divided by the number of total URLs/rankings. The more diversity, the less SERP “crowding” – here’s a 30-day view:
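The calculation itself is simple. Here's a minimal Python sketch of the ratio described above; the function name and sample URLs are mine for illustration, not MozCast's actual code:

```python
# Hypothetical sketch: domain diversity = unique subdomains / total URLs.
from urllib.parse import urlparse

def domain_diversity(urls):
    """Number of unique subdomains divided by total URLs/rankings."""
    subdomains = {urlparse(u).hostname for u in urls}
    return len(subdomains) / len(urls)

# Toy data set: a repeated subdomain lowers diversity (more SERP crowding).
urls = [
    "http://en.wikipedia.org/wiki/Widget",
    "http://www.amazon.com/widgets",
    "http://www.amazon.com/gadgets",   # same subdomain as above
    "http://www.widgetworld.com/",
]
print(domain_diversity(urls))  # 3 unique subdomains / 4 URLs = 0.75
```

The real data set divides ~5,300 unique subdomains into ~9,500 URLs, which is how you get figures in the 55-61% range.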

Domain Diversity Graph (30-day)

Keep in mind that the range over the past 30 days has been pretty narrow (less than 1%), so let’s take a look at the broader, 90-day view:

Domain Diversity Graph (90-day)

You can hover over any data point for dates and more precise percentages. Here, you can see that diversity increased when Google rolled out 7-result SERPs (from about 8/12-8/14), but has gradually declined over the past 90 days. When we started collecting data in early April, domain diversity was closer to 61%, but it dropped significantly after the Penguin update (on 4/24).

On September 14, Matt Cutts announced on Twitter that Google had made a change to improve SERP diversity:

Matt Cutts tweet

We saw a small bump (about 0.4%) from 9/6 to 9/9, but otherwise have seen no evidence of major improvements. Please keep in mind that this is one data set and one way of measuring “diversity” – I’m not calling Matt a liar, and I’d welcome other analyses and points of view. My goal is to create transparency where we currently have very little of it.

(2) SERP Count (“Shrinkage”)

Over a roughly 2-day period in mid-August, Google rolled out 7-result SERPs (for page 1), and our data shows that it impacted roughly 18% of the queries we track. We originally reported this as the number of SERPs with <10 results, but that presented two problems: (1) fewer results made the graph go up – which is a bit confusing, and (2) that metric doesn't change if the result count changes. In other words (hat tip to Moz teammate Myron on this one), if all of the 7-result SERPs suddenly changed to 6-result SERPs, our original metric would never show it. So, we've replaced that metric with the average result count. Here's a 60-day view:
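To make the arithmetic concrete, here's a hypothetical sketch. If roughly 18% of SERPs dropped from 10 results to 7, the average falls by about 0.18 × 3 ≈ 0.54 results, which is consistent with the ~0.5 drop in the graph below (the exact mix is my assumption, not the actual distribution):

```python
def average_result_count(serps):
    """Mean number of page-1 results across all tracked SERPs.
    Unlike a simple count of <10-result SERPs, this moves if, say,
    7-result SERPs later become 6-result SERPs."""
    return sum(serps) / len(serps)

# Hypothetical mix: 82% ten-result SERPs, 18% seven-result SERPs
serps = [10] * 82 + [7] * 18
print(average_result_count(serps))  # 9.46 (a ~0.54 drop from 10.0)
```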

Average Result Count (60-day)

In this case, an average drop of 0.5 results is massive, and the graph tells the story pretty well. The 30-day data shows much, much smaller variations, but this metric will help us track any future changes, including a return to 10-result SERPs (if that were to happen).

(3) EMD Influence

The influence of Exact-Match Domains (EMDs) is a hot topic in SEO. Our EMD influence metric shows the percentage of Top 10 rankings that are currently occupied by EMDs. Specifically, if the keyphrase is “buy widgets”, then we consider only “buywidgets.tld” (any TLD) to be an exact match. Here’s the 90-day data:
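A sketch of that matching rule in Python, under a simplifying assumption of my own: I treat everything before the first dot as the domain name, which ignores multi-part TLDs like .co.uk (MozCast's actual matching may handle these differently):

```python
def is_emd(keyphrase, hostname):
    """Exact match: the domain name (any TLD) equals the keyphrase
    with spaces removed, e.g. 'buy widgets' -> 'buywidgets.tld'."""
    target = keyphrase.replace(" ", "")
    name = hostname.split(".", 1)[0]  # simplified: ignores multi-part TLDs
    return name == target

print(is_emd("buy widgets", "buywidgets.com"))   # True
print(is_emd("buy widgets", "buywidgets.net"))   # True (any TLD)
print(is_emd("buy widgets", "widgetstore.com"))  # False
```

EMD influence for a day is then just the count of EMD-held rankings divided by the total Top 10 rankings tracked.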

EMD Influence Graph (90-day)

My recent post goes into more detail, and there are a lot of ways to dig into this data, but we’re seeing a slight uptick in EMD influence over the past 3 months.

(4) PMD Influence

Similarly, PMD influence measures the influence of Partial-Match Domains on the Top 10. For the keyphrase “buy widgets”, we count any URL with either “buywidgets” or “buy-widgets” in the subdomain as a partial match. This metric does not include EMDs. Here’s the 90-day view:
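Here's the corresponding sketch for partial matches, again with my simplified exact-match test (first label before a dot) standing in for the real EMD check:

```python
def is_pmd(keyphrase, hostname):
    """Partial match: 'buywidgets' or 'buy-widgets' appears in the
    subdomain. Exact matches are excluded (counted as EMDs instead)."""
    joined = keyphrase.replace(" ", "")
    hyphenated = keyphrase.replace(" ", "-")
    is_exact = hostname.split(".", 1)[0] == joined  # simplified EMD test
    return (joined in hostname or hyphenated in hostname) and not is_exact

print(is_pmd("buy widgets", "buywidgetsonline.com"))  # True
print(is_pmd("buy widgets", "www.buy-widgets.net"))   # True
print(is_pmd("buy widgets", "buywidgets.com"))        # False (EMD, not PMD)
```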

PMD Influence (90-day)

In line with the broader history reported earlier, PMDs seem to be steadily declining in influence. Keep in mind that this doesn’t mean that any particular PMD won’t rank (they still hold over 4% of Top 10 rankings) – it just means that their overall impact is trending downward.

(5) Daily Big 10

Finally, we have a new metric I haven’t covered in any previous blog post, the “Big 10.” Apologies to college football fans (I’m a former Hawkeye), but I didn’t want to confuse this with the “Top 10.” The Big 10 influence is the percentage of Top 10 rankings accounted for by the ten most powerful subdomains on any given day. This list changes daily, and any single day’s data represents the influence of the Big 10 for that day. Currently, the Big 10 domains account for about 13.6% of Top 10 rankings in our data set:
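Conceptually, the daily calculation looks something like this sketch (the data is invented for illustration; the real computation runs over the full ~9,500-URL set):

```python
# Hypothetical sketch: share of Top 10 rankings held by the ten
# most common subdomains on a given day.
from collections import Counter

def big10_influence(subdomains):
    """One entry per ranking; returns the fraction held by the
    ten subdomains with the most rankings that day."""
    counts = Counter(subdomains)
    top_ten_total = sum(n for _, n in counts.most_common(10))
    return top_ten_total / len(subdomains)

# Toy day: 2 heavy hitters plus 12 one-off subdomains (20 rankings total)
subs = (["en.wikipedia.org"] * 5 + ["www.amazon.com"] * 3
        + ["site%d.com" % i for i in range(12)])
print(big10_influence(subs))  # (5 + 3 + 8 singletons) / 20 = 0.8
```

Because the top-ten list is recomputed each day, day-to-day movement in this metric reflects both ranking shifts and churn in which subdomains make the list.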

Big 10 Graph (90-day)

Below the graph for this metric, we also list the Big 10 subdomains for the most recent day. Like all of the MozCast stats, this list is currently recalculated each morning. Here’s the data from 9/18:

  1. en.wikipedia.org
  2. www.amazon.com
  3. www.youtube.com
  4. www.facebook.com
  5. www.ebay.com
  6. www.walmart.com
  7. www.webmd.com
  8. www.yelp.com
  9. www.overstock.com
  10. allrecipes.com

Currently, the roughly 9,500 URLs in our data set (Top 7-10 for 1,000 keywords) represent about 5,300 unique subdomains, so the fact that just ten of them take up almost 14% of the real estate is pretty amazing. Wikipedia alone holds 4.6% of the Top 10 URLs that we track (today). There’s a fair amount of movement in the bottom couple of domains, and Twitter dropped out of the Top 10 earlier this year.

What Would You Like to See?

There are a lot of ways to slice the data and we have quite a few ideas in the pipe, but if there are specific, large-scale metrics you’re interested in, let me know. We’re trying to incorporate community feedback into the product development plan. Also, feel free to make suggestions on the @mozcast Twitter account.

I’d like to quickly thank Devin and Casey for doing the behind-the-scenes work to get this page integrated, and to Devin in particular for turning my single, rambling page of stats into a pretty slick design. Thanks as usual to Dr. Matt Peters for feedback on the math, and to Rand for putting up with dozens of emails and somehow reading them all on top of his other 23 hours/day of work.

Pardon a shameless plug, but if you’d like to hear more about the history of MozCast, I gave an hour-long presentation about it at MozCon in July. The online MozCon videos just went on sale yesterday. Even if you hate me, there’s 16 hours of other great content and you can just fast-forward over my part – I won’t mind, really *sniff.*