Content marketers regularly hear that quality is far more important than quantity. You can publish a thousand blog posts in a year, but if only three of them are truly noteworthy, valuable, and share-worthy—what Rand would call 10x content—then you've wasted quite a bit of time.
Here at Moz, we've published blog posts on a daily cadence since before almost any of us can remember. If you didn't already know, Moz began as SEOmoz in 2004, and was little more than a blog where Rand fostered one of the earliest SEO communities. He offered a bit more background in a recent interview with Contently:
"It’s a habit that we’ve had since 2004, when I started the blog. It’s one of those things where I was writing every night. I think one of the big reasons that that worked so well in the pre-social-media era was because the Moz comments and the Moz blogs were like the Twitter or Facebook for our little communities."
We've taken occasional days off for major holidays when we knew the traffic volume wouldn't be there, but the guiding philosophy was that we published every day because that's what our audience expected. If we stepped back from that schedule, we'd lose our street cred, our reliability, and a sizeable chunk of our audience, not to mention the opportunities for increased traffic.
It's now quite easy to have those discussions on Twitter, Facebook, Quora, and other networks, making our old approach an outdated philosophy that was based more on fear of the unknown and a misguided assumption than on actual data.
This May and June, we decided to change that. We're raising the bar, and we want to show you why.
It started with a tweet:
This week, Hubspot published 49 unique blogposts (or ~10/weekday). I wonder if they've tested various quantities and found that to be ideal?
— Rand Fishkin (@randfish) January 9, 2015
The ensuing discussion piqued the interest of Joe Chernov and Ginny Soskey at HubSpot, as they wondered what effects it might have to publish more or less frequently. We decided to collaborate on a pair of experiments to find out.
The setup
The experiments were simple: Set a benchmark of two "normal" weeks, then adjust the publishing volumes on each blog to (roughly) half the normal cadence for two weeks and double the normal cadence for two weeks.
One thing we should note from the get-go: We were always sure that Whiteboard Friday would continue to be a weekly tradition, so we didn't alter the publishing schedule for those. This experiment only altered the Monday through Thursday schedule.
We closely monitored our blog traffic and engagement metrics, as well as subscriptions to our emailed blog newsletter. HubSpot ran their experiment first, allowing Moz to learn a few lessons from their experience before starting our own.
The results from HubSpot's experiment were also published today; make sure you take a look.
The results
We had several central questions going into this experiment, and hypotheses for how each one would come out. There are five parts, and they're laid out below as follows:
- Effects of increased/decreased volume on overall traffic
- Engagement thins as volume grows
- Subscription slowdown
- Community complaints/backlash
- Trading quantity for quality
Important note: We know this is non-scientific. These results are intended to be directional, not definitive, and our takeaways—while they represent our best attempts at progress—are by no means perfect. We want this to be an ongoing discussion, so please chime in with your ideas in the comments!
1. Effects of increased/decreased volume on overall traffic
Hypothesis
Publishing fewer posts each week will lead to a significant decrease in overall traffic to the blog. Publishing more posts each week will lead to a significant increase in overall traffic to the blog. These changes will be proportional to the decrease/increase in publishing volume.
Results
Let's get the high-level overview before we dive into details. Traffic on the Moz Blog can obviously vary quite a bit depending on the content, but all things considered, it's remarkably steady. Here are total daily unique pageviews to all pages on the blog so far in 2015:
Spikes and dips here and there, but we're able to pull a pretty good benchmark from that data. Here's what that benchmark looks like:
Metric | Value
---|---
Average weekday uniques | 38,620
Average weekly uniques | 227,450
Now, here's the traffic from the four weeks leading up to the reduced/increased publishing frequency, as well as the two weeks at half-cadence and the two weeks at double-cadence (I've also included a line for the average of 38,620):
There's a bit of a difference. You can tell the traffic during half-cadence weeks was a little lower, and the traffic during double-cadence weeks appears a little higher. I'd take the numbers highlighted above in green over the ones in red any day of the week, but those curves show far smaller variation than we'd anticipated.
Here's a look at weekly numbers:
That makes the dip a little clearer, but it's hard to tell from that chart whether the loss in traffic is anything to be worried about.
Let's dive a bit deeper into the two testing periods and see if we can't pick apart something more interesting. You might notice from the above daily charts that the blog traffic follows a regular weekly pattern. It peaks on Tuesday and falls gradually throughout the rest of the week. That's characteristic of our audience, which finds less and less time to read the blog as the week goes on. We wanted to take that variability into account when looking at each day during the testing period, and the following chart does just that.
It plots the traffic during the tests as a percent deviation from the average traffic on any given day of the week. So, the four Tuesdays that passed during the test are compared to our average Tuesday traffic, the four Wednesdays to the average Wednesday, and so on. Let's take a look:
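(A quick aside for anyone who wants to reproduce this normalization on their own data: here's a minimal sketch in pandas. The file name, column names, and period boundaries below are hypothetical stand-ins, not our actual analytics export.)

```python
# A minimal sketch of the day-of-week normalization described above.
# Assumes a hypothetical CSV with one row per day: date, unique_pageviews.
import pandas as pd

df = pd.read_csv("blog_daily_uniques.csv", parse_dates=["date"])
df["weekday"] = df["date"].dt.day_name()

# Benchmark: the average uniques for each day of the week,
# computed from the pre-test baseline period (boundary date is illustrative).
baseline = df[df["date"] < "2015-06-01"]
weekday_avg = baseline.groupby("weekday")["unique_pageviews"].mean()

# Express each test-period day as a percent deviation from its weekday average,
# so Tuesdays are compared to average Tuesdays, Wednesdays to Wednesdays, etc.
test = df[df["date"] >= "2015-06-01"].copy()
expected = test["weekday"].map(weekday_avg)
test["pct_deviation"] = (test["unique_pageviews"] - expected) / expected * 100

print(test[["date", "weekday", "pct_deviation"]])
```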
This is a more noteworthy difference. Dropping the publishing volume to half our normal cadence resulted in, on average, a 5.6% drop in unique pageviews from those daily averages.
That actually makes perfect sense when it's put in context. Somewhere around 10-15% of our blog traffic goes to the most recent week's worth of posts (the rest goes to older posts). If we publish half as many posts in a given week, there are half as many new pages to view, so we'd expect to lose roughly half of that slice: a drop of 5-7.5% overall, right in line with the 5.6% we saw.
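If you want to sanity-check that back-of-the-envelope logic, the arithmetic is tiny:

```python
# Back-of-the-envelope check: if 10-15% of blog traffic goes to the most
# recent week's posts, halving the number of new posts should cost roughly
# half of that slice of overall pageviews.
for recent_week_share in (0.10, 0.15):
    expected_drop = recent_week_share / 2  # half as many new pages to view
    print(f"recent-week share {recent_week_share:.0%} -> expected drop {expected_drop:.1%}")
# Prints 5.0% and 7.5%, bracketing the 5.6% drop we actually observed.
```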
That's pageviews, though. What about sessions? Are fewer people visiting the blog in the first place due to our reduced publishing volume? Let's find out:
That's a bit more palatable. We lost 2.9% of our sessions that included visits to the blog during a two-week period when we cut our publishing volume in half. That's close enough that, for a non-scientific study, we can pretty well call it negligible. The shift could easily have been caused by the particular pieces of content we published, not by the schedule on which we published them.
Another interesting thing to note about the chart showing deviations from daily averages: Doubling the publishing volume did, on average, absolutely nothing to the number of unique pageviews. The average increase in uniques from daily averages during the double-cadence period is just a bit over 3%. That suggests relative saturation; people don't have time to invest in reading more than one Moz Blog post each day. (I'm not surprised; I barely have time to read more than one Moz Blog post each day!) ;-)
It also emphasizes something we've known all along: Content marketing is a form of flywheel marketing. It takes quite a while to get it up to speed, but once it's spinning, its massive inertia means that it isn't easily affected by relatively small changes. It'll keep going even if you step back and just watch for a short while.
2. Engagement thins as volume grows
Hypothesis
The amount of total on-page engagement, in the form of thumbs up and comments on posts, will remain somewhat static, since people only have so much time. Reducing the blog frequency will cause engagement to approach saturation, and increasing the blog frequency will spread engagement more thinly.
Results
Moz's two primary engagement metrics are built into each page on our blog: thumbs up and comments. This one played out more or less to our expectations.
We can get a good sense for engagement with these posts by looking at our internal 1Metric data. We've iterated on this metric since we talked about it in this post, but the basic concept is still the same—it's a two-digit score calculated from several "ingredients," including metrics for traffic, on-page engagement, and social engagement.
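We haven't published the exact ingredients or weights, but conceptually a score like this is just a weighted blend of normalized metrics. Here's a purely hypothetical sketch of the idea; the ingredient names, normalization caps, and weights below are illustrative assumptions, not the real 1Metric formula:

```python
# Hypothetical sketch of a composite content score in the spirit of 1Metric.
# The ingredients, caps, and weights are illustrative assumptions only.
def content_score(unique_pageviews, thumbs_up, comments, social_shares):
    # Normalize each ingredient to a 0-1 range against an assumed "great post" cap.
    traffic = min(unique_pageviews / 20_000, 1.0)
    on_page = min((thumbs_up + comments) / 300, 1.0)
    social = min(social_shares / 1_000, 1.0)

    # Weighted blend, scaled to a two-digit score.
    weights = {"traffic": 0.5, "on_page": 0.3, "social": 0.2}
    blended = (weights["traffic"] * traffic
               + weights["on_page"] * on_page
               + weights["social"] * social)
    return round(blended * 100)

# Example: a well-trafficked, well-discussed post scores 72 under these assumptions.
print(content_score(unique_pageviews=15_000, thumbs_up=120, comments=65, social_shares=800))
```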
Here's a peek at the data for the two testing periods, with the double-cadence period highlighted in green, and the half-cadence period highlighted in red.
Publish Date | Post Title | 1Metric Score | Unique Pageviews |
---|---|---|---|
25-Jun | How Google May Use Searcher, Usage, & Clickstream Behavior to Impact Rankings - Whiteboard Friday | 81 | 12,315 |
25-Jun | How to Rid Your Website of Six Common Google Analytics Headaches | 56 | 7,445 |
25-Jun | How to Build Links in Person | 36 | 5,045 |
24-Jun | What to See, Do, and More at MozCon 2015 in Seattle | 9 | 2,585 |
24-Jun | The Absolute Beginner's Guide to Google Analytics | 80 | 15,152 |
23-Jun | Why ccTLDs Should Not Be an Automatic Choice for International Websites | 11 | 2,259 |
23-Jun | Brainstorm and Execute Killer Content Ideas Your Audience Will Love | 38 | 5,365 |
22-Jun | The Alleged $7.5 Billion Fraud in Online Advertising | 85 | 44,212 |
19-Jun | How to Estimate the Total Volume and Value of Keywords in a Given Market or Niche - Whiteboard Friday | 78 | 15,258 |
18-Jun | The Colossus Update: Waking The Giant | 62 | 14,687 |
17-Jun | New Features in OSE's Spam Score & the Mozscape API | 10 | 1,901 |
17-Jun | How to Align Your Entire Company with Your Marketing Strategy | 44 | 7,312 |
16-Jun | Dissecting and Surviving Google's Local Snack Pack Results | 15 | 2,663 |
15-Jun | Can You Rank in Google Without Links? New Data Says Slim Chance | 81 | 15,909 |
15-Jun | Study: 300 Google Sitelinks Search Boxes - Triggers and Trip-Ups Analyzed | 23 | 3,207 |
14-Jun | How to Choose a PPC Agency | 14 | 2,947 |
12-Jun | Why We Can't Do Keyword Research Like It's 2010 - Whiteboard Friday | 90 | 22,010 |
11-Jun | Eliminate Duplicate Content in Faceted Navigation with Ajax/JSON/JQuery | 38 | 5,753 |
9-Jun | 5 Spreadsheet Tips for Manual Link Audits | 50 | 6,331 |
5-Jun | Should I Use Relative or Absolute URLs? - Whiteboard Friday | 79 | 15,225 |
3-Jun | How to Generate Content Ideas Using Buzzsumo (and APIs) | 50 | 10,486 |
1-Jun | Misuses of 4 Google Analytics Metrics Debunked | 51 | 9,847 |
The 1Metric scores for the half-cadence period (in red) average almost 60, suggesting those posts performed better overall than those during the double-cadence period, which averaged a 1Metric score of 45. We know the traffic was lower during the half-cadence weeks, which suggests engagement must have been significantly higher to result in those scores, and vice-versa for the double-cadence weeks.
Taking a look at our on-page engagement metrics, we see that play out quite clearly:
The number of thumbs up and comments stayed relatively level during the half-cadence period, and fell sharply when there were twice as many posts as usual.
We're incredibly lucky to have such an actively engaged community at Moz. The conversations that regularly happen in the comments—65 of them, on average—are easily one of my favorite parts of our site. We definitely have a "core" subset of our community that regularly takes the time to join in those discussions, and while the right post will tempt a far greater number of people to chime in, you can easily see patterns in the users who spend time in the comments. Those users, of course, only have a limited amount of time.

This is reflected in the data. When we published half as many posts, they still had time to comment on every one they wanted, so the number of comments left didn't diminish. Then, when we published twice the number of posts we normally do, they didn't spend twice as much time leaving comments; they were just pickier about which posts they commented on. The number of comments on each post stayed roughly the same.
The same goes for the thumbs.
3. Subscription slowdown
The Moz Blog is available via an email subscription through FeedPress, linked to from a few different places on the site:
We wondered, what would happen to those subscriptions during the half-cadence period?
Hypothesis
With fewer opportunities to impress people with the quality of the blog's content and earn a spot in their inboxes, subscriptions to the blog posts will drop significantly during the half-cadence period.
Results
As it turns out, there was minimal (if any) effect on email subscriptions. Check out the numbers for both periods below:
Here's a view that's a bit easier to digest, similar to the one for traffic in part 1 of this post. This shows daily deviations from the average number of new email subscriptions we get (about 34/day):
On the whole, this is a very uninteresting (and for that reason interesting!) result. Our subscription rate showed no noteworthy fluctuations during either of the two testing periods.
These daily numbers are net: they're based on the change in the total number of subscribers, so they already account for unsubscribes. With half as many emails going out during the half-cadence period, any email fatigue among subscribers should have eased, reducing unsubscribes and nudging the net subscription rate up. No such bump appeared, which suggests our daily emails weren't wearing people out in the first place. We're comfortable continuing to send them daily.
One important note is that we don't send multiple emails each day, so during the double-cadence period we were sending daily digests of multiple posts. (Were we to send more than one each day, we might have expected a significant rise in unsubscribes. That's something HubSpot was better able to track in their version of this experiment.)
4. Community complaints / backlash
This was another primary concern of ours: If we skipped days on the editorial calendar, and didn't publish a new post, would our community cry foul? Would we be failing to meet the expectations we'd developed among our readers?
Hypothesis
Having multiple days with no new post published in a relatively short period of time will lead to disappointment and outcry among the readership, which has grown to expect a new post every day.
Results
While we didn't proactively ask our community if they noticed, we watched social media specifically for mentions that there was no blog post on one or more of the days we skipped during the half-cadence period. We figured we'd find a bunch of "Hey, what gives?" tweets. Our community team is great at monitoring social media for mentions—even those that don't specifically ping us with @Moz—and this is what we found:
A single post.
I guess @Moz is looking into only posting 3 blogs a week. It's the most depressing A/B test I've ever come across.
— Ben Starling (@BeenStarling) June 4, 2015
That's really it. Other than this one tweet—one that elicited a heartfelt "Awww!" from Roger—there wasn't a single peep from anyone. Crickets. This hypothesis couldn't be more busted.
We asked in our most recent reader survey how often people generally read the Moz Blog, and 17% of readers reported that they read it every day.
Even if we assume some statistical variance and that some of those responses were slight exaggerations of the truth (survey data is never squishy, right?), that's still a sizeable number of people who—in theory—should have noticed we weren't publishing as much as we usually do. And yet, only one person had a reaction strong enough that they posted their thoughts in a place we could find them.
5. Trading quantity for quality
This is a far more subjective hypothesis—we can't even measure the results beyond our own opinions—but we found it quite interesting nonetheless.
Hypothesis
If we post fewer times per week, we'll have more time and be better able to focus on the quality of the posts we do publish. If we publish more frequently, the quality of each post will suffer.
Results
As nice an idea as this was, it turned out to be a bit backwards. Publishing fewer posts did leave us with more time, but we didn't end up using it to dive deeper into revisions of other posts or come up with additional feedback for our scheduled authors. The Moz Blog is written largely by authors outside our own company, and even though we had more time we could have used to recommend edits, the authors didn't have any more time than they otherwise would have, and it wouldn't have been fair for us to ask them for it anyway.
What we did do is spend more time on bigger, more innovative projects, and ended the two half-cadence weeks feeling significantly more productive.
We also noticed that part of the stress of an editorial calendar comes from the fact that an artificial schedule exists in the first place. Even with the reduction in volume, we felt significant pressure when a scheduled post wasn't quite where we wanted it to be by the time it was supposed to be finished.
Because we ended up spending our time elsewhere, our experiment didn't focus nearly as much on the comprehensiveness of the posts as the HubSpot experiment did. It ended up just being about volume and maintaining the quality bar for all the posts we published, regardless of their frequency.
Our productivity gains, though, made us begin to think even more carefully about where we were spending our time.
Wrapping up
With some basic data clearly showing us that a day without a blog post isn't the calamity we feared it may be, we've decided it's time to raise the bar.
When a post that's scheduled to be published on our blog just isn't quite where we think it ought to be, we'll no longer rush it through the editing process simply because of an artificial deadline. When a post falls through (that's just the life of an editorial calendar), we'll no longer scramble to find an option that's "good enough" to fill the spot. If we don't have a great replacement, we'll simply take the day off.
It's got us thinking hard about posts that provide truly great value—those 10x pieces of content that Rand mentioned in his Whiteboard Friday. Take a look at the traffic for Dr. Pete's post on title tags since it was published in March of 2014:
See all those tiny bumps of long-tail traffic? The post still consistently sees 3,000-4,000 uniques every week, and has just crossed 300,000 all-time. That's somewhere between 60 and 100 times the traffic of a post we'd call just fine.
60-100x.
Now, there's just no way we can make every post garner that kind of traffic, but we can certainly take steps in that direction. If we published half as many posts but they all performed more than twice as well, that's a net win for us, especially since the better posts will generally continue bringing in traffic for a long while to come.
Does this mean you'll see fewer posts from Moz going forward? No. We might skip a day now and then, but rest assured that if we do, it'll just be because we didn't want to ask for your time until we thought we had something that was really worth it. =)
I'd love to hear what you all have to say in the comments, whether about methodology, takeaways, or suggestions for the future.
I think a missing piece of the data here would be sales leads, trials, etc. for Moz. While Moz started out as a community, the blog now has to be one of the major sources of acquisition and retention for Moz's subscription-based service.
Did the leads dip when you slowed down? In the weeks after? Increase when you did more? In the weeks after? The spacing of the test may have made this harder to track, given the flywheel nature of things.
My agency used to blog a ton, and when I saw increases in traffic, I staffed up to increase posts to 3x a week. More traffic! However, when I looked at new business, there wasn't a change. We made sure the site's funnels weren't to blame, etc., but eventually pulled back when we realized most of the visitors weren't prospects, but competitors or people who just wanted free advice.
But if you're trying to build your brand, people might come back for that "free advice" several times before they convert. I'm not sure of the numbers, but someone at MozCon last week said the number of times people visit the site before converting was huge (a lot more than I was expecting). So if you were looking at your data on a last-click basis, or over a shorter period of time, those visitors might not appear to convert. As a long-term strategy, though, it could generate clients in the future.
Of course - don't get me wrong, this wasn't a short experiment, we did it for years, not days. The sales cycle for agency business is way longer than Moz for sure, so we gave it plenty of room, but it just never actually drove business.
Great question, Jeff -- in retrospect, I wish I'd talked a bit more about that in the post itself.
We took a look into the free trials that folks signed up for during the experiment, but didn't see any appreciable changes. (You can check out the graph here if you're interested.) We also weren't too keen on reporting that we didn't see any changes, as it's near-impossible to draw accurate insights from such fuzzy data.
Andy touched on why in his reply below -- Rand often cites a statistic that Moz community members touch the site an average of eight times before they decide to give our software a try, and the time it takes for those visits to occur often outlives the cookie in their browser (not to mention the myriad other complicating factors). That's not even to mention actual vesting rates to paying subscribers, which happen 30 days after the free-trial signups.
You're absolutely right that the blog is one of the major sources of acquisition and retention, but while we do track assisted conversions as best we're able, the numbers are so soft that they just wouldn't add much here. Therein lies the most difficult part of content marketing (if you ask me): Scaling qualified traffic.
Awesome, thanks!
Trevor,
Thanks for the digestible insight. Moz's "test, don't simply guess" philosophy is on full display in this experiment. I really like that the team was willing to take one on the chin if it meant discerning what best served the community.
Maybe most of all, though, I appreciate that the brand is unwilling to be a slave to data. In the end, it's about more than traffic, likes, comments, visits and even conversions.
If you put the needs of the audience first, consistently, those things tend to happen, a fact evinced by how Rand began the blog in the early days to share information with the community.
Proud teammate,
RS
I can't tell you how much I appreciate the kind words, Ronell, and how much I appreciated your help making everything happen while the experiment was running!
In the end, it's about more than traffic, likes, comments, visits and even conversions.
Yes, yes, yes, and yes again. Those are all signs that things are going well, but shouldn't be the goal. If they were, we'd succumb to clickbait and cuddly kittens like so many other sites have over the years. It's about providing value to the readers, and if we can increase the ratio of the value they get to the time required of them to get it, we're doing our jobs well.
Well, things are much tougher now, for sure. But it was great work by Rand to create a community at SEOmoz. I believe Moz has actually created a trend: after seeing the success of Rand's methodology of teaching every secret and bit of knowledge you have in your niche, people have started sharing the secrets of their own niches on their websites, which they were reluctant to do before.
Good study, as always!
Since I discovered this site, I've visited every day looking for new articles you've written. I had noticed that the number of posts had dropped, but I figured it was holiday time and you were taking it easier.
That led me to find and read older articles. I still have the habit of visiting daily.
As for complaints: more than one reader may have been surprised that the posts diminished; others, like me, blamed it on vacation (we all have the right to relax a few days a year); and others haven't even noticed. I think only a small minority would be bothered.
I love the research you do; it often surprises me.
Thanks for the kind words, Tino -- that's one of the things we were actually hoping would happen!
A holistic content audit and lots of thinking about making older content more accessible and browsable are on the horizon for us, and as we were gearing up for the half-cadence part of the experiment, our hope was that folks who wanted something to read would look a little farther into the past.
That's a good point about the difference between noticing and being bothered by the change. I'd wager that far more than the one tweeter we mentioned above noticed the change, and we just didn't hear from them because they didn't mind.
Skip a day? As long as you don't skip leg day, then it should all be fine ;)
Well there's a meme I'd never heard of before. (Not sure I'd fool anyone if I told them I went to the gym regularly enough to have a "leg day.") =)
Trevor, you never fail to provide valuable insight.
I'm amazed by all the data and insight you put into this; I can only imagine the time and effort you and your team spent to come up with this post. Thank you for running this experiment for us.
In conclusion, Rand was right on the spot about 10x content. I just want to share the three things I learned that will be beneficial for me: first, create only thought-provoking content; second, publish only well-edited content; and third, publish on a set schedule to gather more traffic.
The third conclusion really got me thinking big time, because I hadn't thought of it until now. I assumed that as long as the content is quality, people will always find time to read it, but you made me realize that isn't the case.
Thanks, Mark. Yeah, finding the right time and frequency to publish is a complex process, and one that likely looks a bit different for everyone. We had an interesting Twitter conversation this morning about what we call the point of "saturation," where an audience simply isn't willing to spend more time learning what you have to say... even if you've got a very high-quality piece. That point is in a different place for different audiences, and one of the most difficult tasks content marketers have is figuring out where it is for theirs.
Thanks for sharing the twitter conversation. :)
Great, candid and balanced analysis. Thanks for the report and details! The issue of finding the balance between quantity and quality is something many of us struggle with regularly. As the saying goes, 'just because you can doesn't mean you should.'
I look at nearly all MOZ blog announcements that come through my email, screening for those that seem to fit our needs and my role in our organization. But, I have to admit that I often feel quite overwhelmed with the sheer quantity of MOZ announcements that appear in my email box.
Then again, it's great to have the information, and all I have to do is scan and select. What a luxury.
Your balanced decision, based on statistical analysis rather than emotions, is a great role model in the quantity vs. quality struggle. Keep up the great work. Go MOZ!
Thanks, Robert -- really appreciate the kind words and support. =)
Very interesting. Thank you for sharing this data.
A question that I have is... What data do you use to inform content development, or to issue the call for content contributions? Do you monitor the types of content (by topic or by target audience) to see what is consumed by members? What brings in traffic from search? What triggers conversions? What is evergreen?
My experience in a very different industry is that the most successful content covers a topic that surprises the visitor, one that was totally off my radar. But this can be risky content to spend time or money on, yet it can be awesome when it succeeds.
Interesting experiment. Right now you've got the audience trained to expect daily posts, so I wouldn't expect your traffic to decrease in just a two-week period.
Did you measure bounce rate to the Moz Blog home page? Did that see a significant change when there was no new content for that 17% who self-report that they read daily?
Overall, I'd expect that with only 2 weeks at a new cadence, the readership would not be trained yet as to what the new publishing schedule was. All of the productivity articles talk about needing around a month to train yourself into a new habit, and visiting a web site to see what content is there might follow the same pattern.
I think you're exactly right, Craig. A longer experiment would be really beneficial. I hadn't looked into the bounce rate before, but great idea: Here's a chart of what it looked like.
In short, the half-cadence period saw (on average) a 5.8% increase in bounce rate from normal, and the double-cadence period saw (on average) a 4.2% decrease from normal.
So yeah, I think you're really onto something, and it lines up with good common sense -- many of those daily readers wouldn't have realized what was up, so they would have come back every day... adding to the traffic... but would have left right away when they didn't find anything new. I don't think the bounce rate increase would ever equal that 17%, but 5.8% is nothing to sneeze at.
I'm definitely thinking this won't be the last time we run experiments along these lines. =)
Was that the bounce rate off the site, or the bounce rate to the HOME page? If it's to the HOME page, it may mean that your other content got more views? Was there any impact on time on site? So many questions... and again, in such a short experiment, where the actual content could impact the results, it is not statistically valid to make any grand assumptions, imho... But it could lead to some great new hypotheses...
That was the bounce rate off the site. When it comes to blog posts, I pay very little attention to time on site as a comparative metric. Some posts simply take quite a bit longer to get through, and the only sessions that are counted are those when the user takes another action on the site after loading the page. I might look into avg. #pages/session -- that could yield some interesting results -- but on the whole, I completely agree with you -- there's no real statistical validity to this experiment. It was really an exploratory study to give us some general validation of things for which we'd previously relied on gut feel. When I find a bit more time, I think I'll try and do another study that's longer-term and narrower in focus in hopes of getting some truly reliable data. For now, hypotheses are fun. =)
I guess because you have YouMoz, you still have quite a few other posts being produced each day, so there's still content for people to read and consume.
If people assume YouMoz and the Moz Blog are the same thing, they might not have noticed the drop in articles. I probably spend more time reading the YouMoz articles than the more in-depth articles on the Moz Blog.
But it's a great little experiment. We have the same internal calendar (we must post each day), and we might try running this same experiment one day to see if we get similar results. It would take some pressure off our content team and free them up to create the 10x content that we strive for.
Very true, Andy -- that was the subject of conversation in reply to Rand's original tweet. One thing I'd love to dive into if I have a bit more time is whether there was an increase in sessions that included moz.com/blog, and then had either moz.com/ugc (YouMoz) or moz.com/rand (Rand's blog). That'd underline exactly what you're suggesting -- that folks showed up to read the Moz Blog and saw there was nothing new, so they clicked through to one of the other blogs. I'll be sure and update here if I get the time to look that up.
Awesome to hear you spend a good deal of time on the YouMoz blog -- we've been quite happy with the posts we've been able to publish over there, and it's great to hear you're enjoying it. =)
If you ever do decide to run your own version of this experiment, I'd love to hear about your results (and I'm sure Ginny at HubSpot would, as well). Please let us know if you do!
Great little experiment Trevor! I'm not at all surprised that engagement thinned with an increase in volume. As you noted, a lot of people don't have the time to read blog posts. Additionally, those who do have the time might not have the mental energy to handle more than one long post a day. In fact, I have no doubt that some business owners and marketers are in that group. Of course, long posts aren't a bad thing. I'm just pointing out that some people, like myself, need time to review and think about all of the valuable information made available on this site. Some people won't comment until they have thoroughly considered everything. I believe you might be able to increase engagement by using shorter posts on days when you decide to publish more than one post a day. I also believe that an early alert about the change in post length - a day or two in advance of the release - might also help.
Thanks, Todd. I've played around with ideas like that before -- something of an "up next" info box on the blog, offering the title, short description, and possibly additional info like length before a post is even published. I love the idea of setting expectations in advance -- if it helps people get more of what they need and less of what they don't from our site, saving them a bit of time, it's a win.
Very detailed post Trevor, thanks! It definitely made us feel a bit better about missing post deadlines! I wonder how all of it affects the SERPs, though :)
Such a lovely post. Interesting data, as it shows our community's interest and behavior. There's so much to consider here, and a lot for us to understand about how content engages users.
Great work everyone! When I took the survey, I swapped "Check the site daily" for "Read the site daily" Cause I check 1-5 times per day. :-)
Fantastic! That's what we like to hear. =) Thanks, Lou.
Trevor, great experiment and analysis!
I've been thinking (at my company) about a strategy of publishing less and would love some feedback when you (or anyone else!) have time.
Here's a basic example. One blog post gives advice on doing X. Another blog post gives other advice on doing X. Still another blog post gives even more advice on doing X. Instead of publishing three blog posts, why not create a single guide on doing X and publish it elsewhere on the site as the most-authoritative online content on that topic. Moreover, it will remain evergreen and can be updated and changed as needed.
If the guide is truly the best thing on X on the Internet, then it will surely rank more highly and get more traffic, links, and shares than the three lesser blog posts combined.
However, such a strategy leads to other questions: What, then, should and would still be published on this hypothetical blog itself?
Would love any thoughts when you have time!
Thanks, Sam! Glad you enjoyed it.
I think you're on the right track, but why not release all three things as separate blog posts, then begin the work of combining them (and filling in holes), eventually releasing that guide and 301'ing the previously released posts? It's a technique I picked up from Mack Fogelson, and has a number of benefits:
And more. It's actually what we're attempting with our upcoming Beginner's Guide to Content Marketing -- we've got a chapter on ideation that Isla released last month, and we're planning on putting another chapter or two up before the guide itself is actually released.
In terms of what should still just be published on the blog, then, I'd say everything -- it's just a matter of whether or not you're gearing it up for inclusion in a bigger piece of content down the road. The topics that fall into narrower niches -- those which target smaller segments of a broader audience -- can remain as blog posts.
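To make the 301 step concrete, here's a minimal sketch of how the redirects might look; the URLs are hypothetical, and in practice you'd likely configure this at the web server or CDN rather than in application code:

```python
# Minimal sketch of 301-redirecting consolidated posts to the new guide.
# Hypothetical URLs; this is an illustration, not how Moz's site works.
from flask import Flask, redirect

app = Flask(__name__)

# Old standalone posts that have been folded into the single guide.
CONSOLIDATED = {
    "/blog/doing-x-part-1": "/guides/doing-x",
    "/blog/doing-x-part-2": "/guides/doing-x",
    "/blog/doing-x-part-3": "/guides/doing-x",
}

@app.route("/blog/<slug>")
def old_post(slug):
    path = f"/blog/{slug}"
    if path in CONSOLIDATED:
        # A 301 tells search engines the move is permanent, so link equity
        # from the old posts is consolidated onto the guide.
        return redirect(CONSOLIDATED[path], code=301)
    return f"Post: {slug}"  # placeholder for normal post rendering

if __name__ == "__main__":
    app.run()
```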
Great experiment and a fantastic write-up! Thanks for taking the time to share the results so openly. I hope we'll see a follow-up if you all attempt a longer 'half-cadence' study.
You can bet your bottom dollar. =)
Great study!
I've tested publishing frequency over the last six months and found that posting less (going from 4 times per month to 2 times per month) increased engagement. It also led to more time to promote the content, which I used to get more social shares and links to each new blog post.
Fascinating! I was most interested to read the part about quality vs quantity...and was glad to see it's what I expected. More time does not really improve my writing as long as I always strive to write my best, if that makes sense. The takeaway is that quality writers never write crap, lol.
I've never commented, and I apologize for that because I'm a high-volume reader and a huge fan. How could you know that unless I make myself known?
Thanks so much, and I'm so glad you hopped into the discussion here!
There's one more point I didn't talk about above, and it's one that Ginny covered pretty well in the results from HubSpot's version of this experiment. There's obviously a minimum threshold under which we simply won't publish a post, but once it provides sufficient value and intrigue, we're happy to move forward. That doesn't mean, though, that all of our posts provide as much value as they could if we spent more time. It isn't necessarily that posts couldn't be improved; it's just that we see diminishing returns as we put more and more time into them. I'd bet it's sort of the same situation for your writing; am I right about that?
Hi Trevor & the gang,
Thanks for sharing these results! It's very cool that you and the team at Hubspot spent the time to do this and then share the results with the community. You're both trusted brands, and I've learned a lot from you guys.
Would you consider doing a longer test of these hypotheses? You describe the majority of the results as somewhat inconclusive, negligible, or fuzzy. Maybe they just need more time to develop?
Going on this frequency test alone, it seems that the half-cadence strategy deserves further testing. If nothing else, it would allow more time for the team of writers to tackle other tasks and polish off even better posts.
I think you're exactly right, Emily -- the half-cadence approach (or some variation on that) deserves more testing in the future, as two weeks simply wasn't enough time to see conclusive patterns develop. I thought about trying to use our knowledge of past performance of posts (the shapes of the "traffic tails," if that makes sense) to extrapolate these results and estimate longer-term effects, but decided there was too much guesswork involved. I'd love to see what a couple of months with just one fewer post each week would do, and might just have to set that up. =)
Thanks for the kind words!
Yikes, I was starting to worry as I read through that post.. but settling on occasionally not publishing as per the schedule because you want to get that post just right? I'm good with that :-)
How much time is spent, on average, on your blog posts? Of course the time can vary from posts which a topic expert can whip up in no time and a post which has data, experiments, etc. (like this one), but how much time should be allotted to writing a blog post?
It really does vary dramatically. This one, for example, took me around 20 hours (including the data analysis). A short and sweet post that briefly covers a specific topic might take an author as little as a couple of hours, and a comprehensive guide (like this one that Mike King wrote about personas) could easily take 100+. It all depends on the author, their writing experience, their level of knowledge (and thus how much research they have to do), etc.
EDIT: Apparently Mike King is a blog-writing phenom. Evidence
Thanks for the insight! I guess a follow-up test to this experiment would be to determine whether the time spent on a post has anything to do with its success. For Moz, The Search Agency, or any company with topic experts writing posts, it would be great to know who is best suited to write which topics, and how much time should be allotted to post creation. Is there a point of diminishing returns where additional hours spent on a post have less of an impact on its performance?
For Moz, it looks like you are already tracking post performance with your "Post Analytics". If we take that performance data, go back and ask authors to provide the time spent on post creation (and include it as a metric moving forward), we might see some interesting results. For example: "Pete, write about these topics because you can whip them up in a couple of hours with no sweat, and they yield more sign-ups, page views, etc." On the other hand: "David, shift your focus to these topics because those types of posts are your strong suit, are what people respond to best, and are easier for you to write."
I'd be fascinated to see the results of that particular study, but I have to say I'm pretty skeptical that they'd be statistically significant. Pete could write about those topics and do a fantastic job, or he could have a headache one day and not be able to think of a fresh angle, so he spent twice as long on a post that did half as well. There's such an immense number of variables in the writing process that the data would be really squishy. I suppose there's comfort for editors in the job security that affords, though. ;)
Great content .. thanks for sharing!
One of my favorite articles is "The Alleged $7.5 Billion Fraud in Online Advertising".
One of the earliest SEO communities in 2004?!? Ah, that's OK, you were probably in elementary school when I started.
And now every time Moz skips a day, we'll be waiting with bated breath for a rockstar post :)
Teed ourselves up pretty well for that, didn't we? =)
We'll certainly give it everything we've got!
Very interesting! Not what I would have expected at all. Keep up the great work guys. And now back to your regularly scheduled programing...
Love the idea, and I agree with you on the approach. Now let's jump into real SEO topics yet to be uncovered, such as semantic search, international SEO, and backlink profile authority... In the early days we missed the basics; now we're looking for the last mile.
Great Friday-night read, Trevor. Thanks :) That was quite a ballsy experiment, where Moz could have lost some user trust or caused confusion among the community. The flywheel part makes a lot of sense, at least for the Moz Blog. What I'd like to see is a test like this focused on a well-read blog from a lesser-known brand, to see if those results fluctuate more heavily.
Best,
Bas
No guts, no glory, right? =)
I'm right with you -- I'd love to see more testing along these lines (and probably for longer durations, measuring effects in various ways for longer periods of time) for a number of different kinds of companies. The concept of audience behavior seems so nebulous for so many organizations, and some basic testing along with a combination of guts and common sense would do wonders for the industry.
Great experiment and a fantastic write-up! Thanks Trevor!
Along with content publishing, we need content promotion as well. Making time to promote each piece after publishing it is very important, so we publish content on our blog 4-5 times a month.
As a wedding photographer here in Charleston, SC, I am learning so much about how competitive my business is and how to stay ahead. I have learned so much from reading blogs like this one: how to improve my site's content as well as my blog, and how to write posts with the right amount of content. Trevor, I guess "less is more"? If you get a chance, check out my site: www.kingstreetphotoweddings.com. Thank you, Moz; I'm glad to be a part of this amazing community, learning from you all.
My Very Best,
Michel
[link removed by editor]
So glad you're finding value in the content, Michel -- that's our goal in producing it. =)
Less is often more. There's that old quote often misattributed to Mark Twain, "If I had more time, I would have written less." Many an editor lives by that mantra. At the end of the day, that's too simplistic a rule, though. Think of the very heart of why we write, or speak at a conference, for that matter: We have ideas, and we want to take those ideas from our heads and put them in everyone else's heads, with as little lost in translation as possible. The more effective the writing, the better it's able to do that. Less is more when there was already too much, but there comes a point where you've covered what you intended to cover about as succinctly as possible. It's not about less, it's about one of my favorite words, compendious: "containing or presenting the essential facts of something in a comprehensive but concise way." It's a lesson I learned from days in journalism, where column inches come at a premium, and translates beautifully to content marketing.
Absolutely awesome. It also leaves me scratching my head. I'd think the value of "more is better" comes largely from earning more rankings in the SERPs. Lately I've been looking at the many train-wreck websites that have what I call "topic creep," usually a result of not sticking with a content plan. Pushing out more content can mean rushing, and can mean getting too "desperate" for ideas outside the defined holistic theme of a site or blog. Narrower can be better. That's not an issue here the way it is with so many more-vs.-less blogs. It would seem the variance has more to do with the new users and readers who don't have a set time budget for Moz, but visit when they find a share or, more importantly, a relevant search result.
Hi Trevor, very helpful post. I love to read 3-4 posts from Moz every day if they add to my store of knowledge.
Great work, Trevor!
Thanks for such a detailed post.
Cool experiment, and I'm sure it works well for a site like the Moz Blog, but for smaller websites that need to establish themselves with Google, I don't know that this is the way to go.
That's a really important point -- our blog has a well established readership, and things might be significantly different for other organizations. We had an interesting discussion on Twitter this morning about this very fact.
In short, quantity can help -- but only up to a certain point (what we call "saturation"), and only if the quality is also there. Quality is a prerequisite for quantity. We also believe the point at which an audience reaches that "saturation" varies quite a bit. At the end of the day, the goal is to provide as much value as possible in whatever time your audience is willing to give you. Anything beyond that will show extreme diminishing returns.
The 10x rule is definitely more effective, but overall it's a complex science. I run an Australian news website and have done some experiments with quantity versus quality in content publishing. I tried publishing almost 4 posts per day, but quality suffered in most of them. One post that people will share gets more traffic. That said, when I was able to pick unique topics and creative headlines, all of those did well, because these days many people on social media will like and share based on the headline and the first few paragraphs alone. With all the clutter in news feeds, very few posts get clicks anyway, and very few users have the time to read all of them.
Even on social media, especially Twitter, users will just like or retweet based on a headline that matches their niche. Very few actually open and read the posts.
But at the end of the day, quality wins. The technique above is good for generating temporary traffic, but quality posts with the 10x effect are better for long-term brand building.
First of all, thanks for the informative post. Content is king; I think there is no doubt about it. Getting a good ranking in Google is not an impossible task, even for the most competitive keywords; I'm getting my keywords onto the first page within the first 20 days. It's good to keep content user-friendly, not only for Google but also for the website's overall image and the company's reputation.
I've long asked myself this same question, but my traffic volume is so small and stable that the test wouldn't have shown much. Good job! And yes, I'd prefer you publish only when you have content at that level, whether that's every day or twice a week.
Classic study!! You may skip a post some day of the week, but don't ever do this with WBF; it's a request :)
Fear not, Umar -- we won't touch Whiteboard Friday. We skipped one once, and the number of people who asked what was going on ensured we'll be sticking with that routine for as long as we're able. =)
You've made a great analysis, Trevor. Keep going down this path, because we're very interested in these studies to improve our websites!
It's clear that social networks are what attract the most traffic to your site, but only sometimes.