Over the past few years, our industry has changed dramatically. We have seen some of the biggest removals of spam from the search results, and a growing number of people are starting to focus on building quality links rather than just building links. Companies are starting to really invest in content, and sites are building better category pages and are improving their product descriptions. Content marketing is now a "thing." A big thing.
However, while all these changes are great, it seems as if we have stopped testing and simply adopt new ideas as they arrive. While I know there are many exceptions to this generalization, I see the trend too often. Most SEOs work off of best practices, and that's reasonable: who can argue with good page titles, headlines, and copy, crawlable paths to content, and quality links? But we need to keep testing and refining these practices to get the best results.
A great example of this sort of refinement is ranking factors research. A few years back, SEOmoz did some testing around H1s vs. H2s and said that the H1 doesn't provide an added benefit. Whether or not you agree with this idea, this example shows how factors can (potentially) change over time.
Over the last few years, Google has rolled out updates that have had a significant impact on search: the canonical tag, hreflang, and rich snippets/schema support, just to name a few. While there have been tests on these updates, we need to continue testing and keep our knowledge current. Google is continually testing new things, and we need to rely on our own testing to keep up. For example, back when Google's Search Quality Updates were a thing, Google would "share" the names and descriptions of updates and tests to the search algorithm. Frequently, there were 30-40 updates a month that they were rolling out or testing.
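For reference, since I just name-dropped them, here is a minimal, hypothetical sketch of what the canonical tag, hreflang, and schema markup can look like on a page; all URLs, product names, and prices below are placeholders.

<!-- Hypothetical page showing the canonical tag, hreflang, and schema.org markup -->
<html>
<head>
  <!-- Canonical tag: points duplicate or parameterized URLs at the preferred version -->
  <link rel="canonical" href="http://www.example.com/tires/winter-tires/" />
  <!-- hreflang: tells Google which language/region variant of the page to show -->
  <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/tires/winter-tires/" />
</head>
<body>
  <!-- Schema.org microdata that can power a rich snippet in the search results -->
  <div itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Winter Tire</span>
    <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
      <span itemprop="price" content="99.99">$99.99</span>
      <meta itemprop="priceCurrency" content="USD" />
    </div>
  </div>
</body>
</html>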
As you already know, 30-40 changes a month adds up to a huge number of potential tweaks to the algorithm over time. We need to be testing (new things and old) to make sure we're staying current.
Share your results
In addition to all the updates we are aware of, there is a lot that Google isn't telling us. This is what makes testing and sharing even more important. Barry Schwartz pointed out on Search Engine Roundtable that Google left some important items out of its August/September update list. Further, there are updates that Google will outright deny. If it weren't for people carefully watching and analyzing the SERPs and then sharing their tools (like Dr. Pete's MozCast), we would be largely unaware of much of this activity.
If we don't share our observations after testing, we face two problems: first, we can't confirm and verify what we see (and believe), and second, we can't move our industry forward. While the SEO industry is evolving and SEO is gaining more widespread acceptance, it is still seen by many as a mystery and a dark art. By sharing our tests and results, we educate the industry as a whole and raise not only the bar, but also our collective reputation. If we can retire bad practices and low-value tactics, we bring more credibility to the industry.
Share your failures
We all want to conduct awesome, breakthrough tests; it's really exciting to learn new stuff. However, we have a tendency to share only our successes, never our failures. No one really wants to share failure, and it's natural to want to "save face" when your test doesn't go according to plan. But the fact remains: a test that "fails" isn't a failure.
There is so much we can learn from a test that doesn't go as expected (and sometimes we don't know what will happen). Further, sharing the "failed" results can lead to more ideas. Last week, I posted about 302s passing link equity. I began that test because my first test failed. I was trying to see if a page that was 302'd to another page would retain its rankings. It didn't work; the page I was testing dropped out of the SERPs, but it was replaced by the page on the receiving end of the redirect. That result led me to test 302s against 301s. On top of that, there was a really good comment from Kane Jamison about further tests to run to gain a better understanding. If I hadn't shared my "failed" results, I would never have learned from my mistakes and gained knowledge where I least expected it.
Below are a few other tests I've run over the years that ended up with "failed" results. I hope you can learn as much from them as I did.
Keyword research with AdWords
For this test, I needed to provide a comparison of head vs. long-tail search volume related to tires. I had heard, at one point, that you could use AdWords impression data for keyword research, so I decided to give it a try. I whipped up a rock-solid domain and set up a broad-match AdWords campaign.
(People even signed up!)
It didn't work. While we got a lot of impressions, we couldn't access the data: everything we wanted was lumped into a category called "Other Search Terms," with no way to break it out.
Lesson learned: AdWords impression data isn't great for keyword discovery, at least not in the way we tried to use it.
Keywords in H2 tags
A few years back, I wanted to see if there was any advantage to placing an H2 tag around keywords in the content. The keywords were styled to look the same as the normal text; the only difference was the H2 tag. I rolled this out on about 10,000 pages and watched the results for a few months.
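As a rough sketch of the setup (the class name, styles, and keyword below are hypothetical stand-ins, not the exact ones I used), the markup looked something like this:

<!-- An H2 wrapped around a keyword phrase, styled to look like normal body copy -->
<style>
  h2.plain-keyword {
    font-size: 1em;        /* match the paragraph text size */
    font-weight: normal;   /* no bolding */
    line-height: inherit;  /* same spacing as body copy */
    margin: 0;
  }
</style>
<h2 class="plain-keyword">winter tires for trucks</h2>
<p>Our full range ships free, with installation available at every location.</p>

To a visitor, the keyword read like any other line of copy; the only difference was the H2 tag in the source.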
What did I find? Nothing. Exactly the same as the control group. Still, lesson learned.
Link title element
This failed test is actually one of Paddy Moogan's. He wanted to test the link title element to see if that passed any value. He set the title to ‘k34343fkadljn3lj’ and then checked to see if the site improved its ranking for that term.
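In markup terms, the test boiled down to a link like this (the URL and anchor text here are placeholders; only the nonsense title value is from the test):

<!-- Does the title attribute on a link pass any ranking value? -->
<a href="http://example.com/" title="k34343fkadljn3lj">a perfectly ordinary link</a>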
There was no improvement.
Later, he found out that the site he was linking to (Craig's) was actually down, so it probably wouldn't have ranked regardless of how it was linked to. This brings up a really important point about testing: double-check everything, even the small details. It can be really frustrating to run a test and then realize it was all for nothing.
Your "failed" tests
We've all been there, so it's time to share your story. What have you recently tested that didn't turn out exactly how you planned? If we can all learn from the mistakes of others, we're in a better place. Drop a line in the comments and let us all know!
Hi Geoff,
I really want to point out how important it is for "us" that the "experts" share their knowledge and test results. And I have to confess the Americans (and the British, too) are especially open in this respect :-).
I have learned so much over the last few years from hundreds of blog posts, more than I can count.
I want to share what we experienced lately regarding subdomains (well, it's not really a failure, but it is an interesting finding).
For several reasons we moved our website from www to a subdomain.
The www version was 301-redirected to the subdomain.
Of course, we lost our rankings at first, but we slowly recovered with the subdomain.
Not to the same positions we had before, though. After a few weeks, on brand searches, the main result started showing the www version again (not the subdomain!!!), while the sitelinks, of course, still led to the subdomain pages.
That seems to be an interesting example of how Google treats subdomains.
Here is the link to the search result.
Cool, thanks for sharing this. It's always interesting to see cases where Google doesn't listen to directives like 301s. Really fascinating!
Yes, I think that testing and sharing can be very helpful. Collectively, those who participate will be leaps and bounds above the rest. Oh, and yes, it does seem that Google is often reluctant to share. It's their trade secret, so I don't blame them.
Great post, Geoff - I've always been a fan of sharing knowledge, and that's why I think SEOmoz has brought us together, along with the hundreds of company and personal blogs packed with information on SEO. Internet marketing is one of the few industries where you don't need any formal qualifications to get into it, nor do you need to go to school/college to learn the methods and the business behind it. School can't teach you SEO, but the Moz community certainly can :)
Thanks, Geoff, for sharing these test cases. In an earlier life I was a research engineer among other researchers. In an ideal situation, it is just as valuable to deliberately demonstrate failure as to prove success. In fact, much of statistically rigorous research starts with a formal hypothesis to the contrary of what may be hoped will be proved. I want to study your post again.
Good reminder to share successes and failures. It would be interesting to compile some "SEO myths and facts," a sort of SEO mythbusters library. There is no shortage of SEO advice out there, but it's nice to have facts to back it up.
For example, a colleague once told me that Google cares about the length of domain registration, i.e. if you only renew your domain for one year, then Google thinks you're a fly-by-night operation. As a result, I started doing 5-year renewals on all of my domains. But then I read somewhere on SEOmoz (I think an article included a quote directly from Matt Cutts) that Google doesn't care about domain registration length. It can be frustrating to change your strategies based on wrong information and find out later that you were following bad advice. Maybe SEOmoz should start a wiki for SEO hypotheses and tests.
Very interesting and useful post! Geoff, thank you for sharing your successes and failures. I wonder if there are many webmasters like you... Cheers!
Great point about failure - it was one of my frustrations with academia. To publish, you need successes, but there's so much knowledge wrapped up in the failures that we miss. Plus, the drive to only succeed leads people to take small risks and do incremental work. I think it can be a recipe for mediocrity.
Thanks Dr. Pete, and thanks for doing those crazy what-if tests, like canonicalling all your pages to the homepage :)
Awesome post Geoff, I've always been a huge fan of sharing information - as a blogger for my company's blog, I love to share as much information as possible. Getting that boost of interaction on the content just makes me want to hug Roger!
Hi Geoff, thanks for the post; it is really good for our SEO health.
The two basic principles Google built search on, ranking the "most linked pages" and "choosing the right page according to the anchor text of the hyperlinks pointing to a page," were changed (not completely changed) by Google once many techniques started exploiting those two principles.
Now the search quality team is compelled to keep changing the algorithm to match user interests, the structure of the document, and its relationship to the user's queries. Maybe more importance is given to user interests now.
Thanks Geoff. I 100% agree with more testing and opening up those results to a wider audience. The industry has gotten more competitive over the last few years, so there's more of a competitive edge in keeping that knowledge in-house, but that also doesn't allow for peer review and verification. I think there is a balance to be struck, though, that lets the researcher benefit strongly while still allowing others to benefit from and expand on that work.
I put // in the .htaccess, and the next thing I knew, the whole site was down :/
Apparently the double // caused all the rules to be ignored. Luckily I made a copy of the file beforehand......... Oh wait!
Always good to back up before you do a test or something new!
It wasn't something new; I guess I did it by accident... But yes, the lesson of backing everything up was a nice one to learn early on.
I couldn't agree more. Testing is something that's severely lacking in today's market. Not only is there little time because we have to constantly maintain our rankings, but I think a lot of companies are nervous about testing new techniques, as the slightest error can send rankings plummeting.
Hi Geoff,
Indeed, sharing failures is something that cannot be omitted; some do not realize just how significant these failed results are. In determining the right concepts and methodologies, direct results and conclusions are not always available; often they have to be deduced by ruling out the other possibilities.
Really interesting - thanks!
Really Impressive Article.
Is this whole post just one larger test? Trippy.
I'll update you later ;)
Congratulations for encouraging the members of the community to share their good and bad experiences. Sometimes a simple observation leads to new thoughts/testing.
Great post Geoff! I really love SEOmoz - I have been frustrated with the lack of increase in PageRank on a bunch of my pages. I have been building quality links on dofollow blogs and have been writing quality content. I know a lot of people say PageRank doesn't matter - but it sure feels good, LOL. Or at least, it gives you a feeling of being on the right track. I guess I shouldn't complain too much - still on page one with most of my keywords and, for some sites, even at the top. So my clients are happy, but I am confused as always!
Great topic, Geoff. I'll share one. I thought Google Authorship was an important ranking factor and even wrote a YouMoz post on it. After doing a more thorough and proper study, it appears that Authorship was not as much of a factor as I thought. Sure, it enhances a result, but as of the time I did this experiment, authorship alone did not appear to be a ranking factor.
Sometimes testing and sharing work wonders. Great post & nice sharing :)
Awesome post! We really need to start cleaning up our industry; there are so many spam sites with crappy backlinks, and so many so-called "SEO cowboys" nowadays.
Yes, testing and sharing can do wonders. By testing we learn, and by sharing we encourage others to test. Great post.
Definitely something people (myself included) need to be reminded of: always be testing. :) Best practices are a great place to start, but there's a great sense of accomplishment in testing and "proving" something yourself.
You bring up a good "d'oh!" point at the end - check what confounding variables there may be. In my first year as an SEO I may or may not have tried to run a test on some pages blocked by robots.txt. That's purely hypothetical though. ;)
It's easy to make mistakes like that, though, and once I've made one, I check every time to make sure I don't do it again.
one time i tried to wash my girlfriend at the time's clothes, and i shrunk them.
one time i was up to bat in little league, with a man on third with us down one run, and i struck out. (people had their rally caps on and everything..)
oh, you mean online marketing.. i've been peeping SERPs related to Agate, and we conjecture, due to his cartoon avatar, that there are some issues with his authorship showing up, though i've checked author verification in G's webmaster tools...
I've heard about people having issues with their G+ account if it's not an actual photo, so I could see this being a problem.
Geoff,
I think you are sharing a valuable perspective. I'd be interested to read more about the parameters that you think make up a good test. What is your testing period, etc. I know it varies from case to case, but there have to be some guidelines, right?
Thanks Zeph. Like you said, it depends so much on the details of the test. If you're just watching rankings, I'd like to have results for a couple weeks to a month. If it's a site architecture test and you have a big site, it could take months. Another big factor is how many pages you test it on - the more, the better :) I know these aren't really helpful numbers, but I believe there is a DistilledU module coming out on testing soon; not sure if you're part of that.
Hey Geoff, totally agree with you; sharing failures or successes helps us all move forward.
Great post, Geoff. I totally agree with this. I recently blogged a rel="author" case study where I talked about how I'd accidentally screwed up, putting multiple authorship signals on one page. Even though I was basically saying "hey, I messed up!", which could've looked embarrassing, I was hoping that my experiences would help others.
The fact of the matter is: we (as an industry) need more case studies, whether successful or failed. "Failed" doesn't have to mean "bad," as you can learn a lot from someone's errors. I personally have a ton of admiration for people who share their mistakes and failures, especially if it's for the use and benefit of others.
Thanks, and I agree, we need more case studies. They are super helpful for showing (convincing) clients that they should be doing something, such as consolidating subdomains. What's the link to the post you wrote about rel=author?
Hey Geoff, it's this post: Confusing Google: A rel="author" Case Study
More than just clients, I'd go as far as to say that case studies can help peers (or even others in other industries), who might link to/share the case study, raising your profile that way, too.
Cool test. I would guess they're just not trusting the data when you have conflicting signals.
"Other Search Terms" report in adwords only gives you data for queries that resulted in clicks. Sometimes google messes up though and sneaks in a few queries that did not result in clicks :)
to be fair, everyone shares only so much. seo/sem isn't exactly a non-profit endeavor. to the contrary, it's a hugely competitive, $$$ billion industry. i certainly utilize a number of proprietary techniques, as i expect do many others.
one simple example: everyone knows backlinks are important - yet within the backlink-building process there are many variables & permutations with varying degrees of importance - do we freely disclose all of our findings & strategies - successful or otherwise? not if we want to maintain and improve our competitive edge - as i think it's reasonable to say we all do.
Great post Geoff!
I've actually found that the hardest thing about making mistakes is admitting them.
Take a look at this slideshow by Rand. On your road to success, it's certain that you will make some mistakes. Just don't give up; learn from your mistakes and correct them.
And finally, thanks for sharing results of these tests!
AdWords impression data is quite irregular for some keywords, and many a time it spoils the quality of our posts.
Nice post, Geoff. Your post reminds me of the quote, "If you've never failed, you've never lived."
A few months back, we ran a content experiment in Google Analytics for our website and forgot to add the canonical tag on the variation page (silly mistake). Later, we found that our variation page (a duplicate) got indexed in Google and started appearing in the results.
We immediately implemented the canonical tag, and after the test was completed, we 301-redirected the variation page to the original page.
Great post Geoff! The best improvements for our work tend to come from trying to repair failures. Thanks for sharing all this information with us!
Nice insights Geoff! I agree, marketers need to definitely get used to testing and optimizing past and present campaigns. That's the best way to get results from marketing efforts, it's just tough since the world of marketing is so fast paced. I've actually even included this point about testing in my latest article published just now, "7 Marketing Tips from March Madness 2013".
I love the idea about sharing results that we find with Google and SEO, but don't know how many companies would actually share this knowledge publicly. I'd love to post content like that, but since our competitors are some of our biggest fans, we wouldn't want to give away all of our strategies.
Thanks!
Melissa
Geoff,
I'm absolutely with you on "keep testing till you find success or failure," and to be honest, I haven't tested anything yet. I want to, but I don't have the resources for it. Also, if you work at a company, you can't risk testing on your clients' sites, and you can't test on your company's website either, because you don't want to risk your own bread and butter.
We are very lucky to have peers like you, Dr. Pete, and all the others who run SEO tests to check the impact of even the tiniest elements of a website.
Thanks.