After nearly 11 weeks of running our landing page competition, we have a winner... Paul Robb of CredibleCopy.org, who put together this winning entry. Congratulations to Paul, who wins the grand prize of $1,000, a year's membership of SEOmoz's premium content, a 3-month trial version of the Offermatica software and more. Our second place winner, Carlos Del Rio (aka inflatemouse), put together a terrific page, too, and deserves equal recognition (as you can see below). Our third place winner, who finished less than half a percent behind, was David Mihm - congratulations to all of you!
We wanted to end the contest nearly 4 weeks ago, but struggled due to a heated, close battle between two of our landing page contestants. Finally, however, it was Jonathan Mendez from OTTO Digital who helped us finish the contest last week. Here's Jon's email to us:
Congrats! Page 9 is your winner. It has a higher overall conversion rate (over 7% lift vs. page 3) and has outperformed page 3 on a daily basis (only 9 of the last 29 days did page 3 perform better).
HOWEVER- You need to look at page 3 and see why it is KICKING ASS (41% lift w/ 99% confidence) getting people to the "choose" step.
Before Jon's email, we were actually debating whether and how we could ever finish the competition, as Pages 3 & 9 seemed to be in a constant battle, though Page 3 would only ever "catch up" or get very close to Page 9 and never surpass it. Below, I've listed the data collected through Offermatica during the testing period:
You can see what Jon means when he talks about how well Page 3 did in getting folks to the "choose" page (where you enter your email or login to your account, then head to Paypal to pay - this page is identical for all the landing pages).
As Mark Barrera cleverly discerned in his YOUmoz post a few weeks ago, the separate pages are available on SEOmoz to view. I've listed them in order as they show in the Offermatica data (as well as showing off the nicknames that we've used internally).
I wanted to follow up on this by asking Jonathan Mendez more about landing page design & testing in general, as well as touch on the specific results for the SEOmoz contest, so I asked him if he'd agree to an interview, and he said yes! Here are the results:
First off, Jon, I'd like to thank you not only for the interview, but for taking the time to navigate us through this difficult process of landing page testing. In fact, that brings me to my first question, which would be, what are the big elements (in your experience) that make constructing these tests challenging?
Thanks Rand. It was a pleasure working with you guys. There are two big challenges in constructing an effective landing page test. One is the test design and the other is the creative differentiation.
By test design I’m referring to selecting how many elements to test. Entire pages, parts of a page, A/B, A/B/C/D or choosing the right multivariate (MVT) array (e.g. 7x2 or 4x3). Most failures I see are when people want to test too many things at one time. After you have decided what you want to test, the amount of traffic and conversion expected over certain periods of time should determine your test design.
The second challenge is creating pages or elements that are as unique as possible from one another. What I’ve seen over the years is that the greater the creative differentiation the more likely visitors will respond to one creative treatment over another. This is important because the net effect is higher confidence levels due to a reduction in margin for error. This allows us to get results faster (or “fail faster”). Some of the best differentiation is testing elements being present or not present at all. In sum, a successful test is usually determined prior to any traffic entering it.
What are some of the less obvious benefits of landing page testing (apart from the obvious increase in conversion rates)? Are there secondary goals and positives that outsiders may not always perceive?
The great part about multivariate testing is that you can learn the factor of influence in percentage that each tested element has on conversion. This provides a tremendous understanding about what drives intent. We’ve had many clients that have taken this learning and applied it elsewhere, namely in their ad creative and messaging. For example, a recent client learned that the eTrust and Verisign seals had a 98% factor of influence for increasing conversion. Not only did they start to use this messaging more prominently in the remainder of their conversion funnel, they started using it in their offline advertising.
The other great benefit is found in segmenting your results. Here we can learn about what is important for different sources and behaviors or even temporally. As you might expect these factors can vary greatly. For example, looking at results segmented by branded keywords, generic keywords and even match type can yield winning ads, pages or elements that are entirely different from one another. This is where we start moving into behavioral targeting and personalization.
How does the Offermatica software operate - and how does it stack up against the competition?
Offermatica is an ASP platform that uses JavaScript tags to change content remotely based on rules. Then, it measures the effects of those changes in real-time. It’s really that simple.
I think Offermatica’s vast leadership position in the market validates it as the best and most versatile tool. Certainly the clients and the success speak for themselves. One cool fact is that Offermatica is serving 5 billion impressions a month. The software has continued to develop since my first involvement with it over two years ago. There are new releases every quarter, and the tool has moved into targeting and personalization with the release of affinity targeting capability, as well as moving from on-site optimization quite effectively into ad optimization. This provides holistic optimization capability from impression to conversion. Not to mention, the new UI in beta has an onsite editor that is insanely cool and simple.
Offermatica (and your company, OTTO Digital) were recently acquired - can you talk about that deal? Good thing? Bad thing? Are you staying on board? What changes should we expect to see?
It was a fantastic deal for both Offermatica and Omniture. Omniture gets a great technology that we’ve already integrated through their Genesis program. Offermatica, besides being validated as the market-leading platform, gets the opportunity for continued growth and success through the extensive Omniture client base and global reach. Beside the obvious ability to offer easy A/B and MVT to clients, ultimately the value is leveraging the valuable segment and behavioral data that Omniture captures. With Offermatica that data can now become immediately actionable for marketers for validation, testing, targeting and retargeting.
As far as OTTO Digital, it’s business as usual for now. I’ll be speaking with Omniture leadership in the coming weeks to hear their thoughts on how OTTO Digital can continue to provide measurable results for clients, grow the market for optimization services and continue doing the most interesting and cutting-edge work in digital marketing. Over the last 18 months we’ve created an amazing company; 20 employees with offices in NY, Dallas, LA and SF. We’ve worked for 4 of the top 11 brands in the world, and our over 35 clients include the market leaders in most major verticals including search, retail, insurance, entertainment, finance and auto. I’m most proud that we’ve had a 100% success rate. Every one of our clients has experienced measurable increases in performance from our work.
Personally, while A/B and MVT testing still make up a large part of my work, the fastest growing piece of what I do revolves around leveraging the web as a platform to improve results for business by creating applications that are highly dynamic, targeted, personalized and use Offermatica to measure and optimize design, delivery and performance. We’ve had great success at OTTO using structured data through REST APIs in this way. Since the web is quickly moving away from “one-size fits all” static site experiences towards personalization and targeted experiences, understanding how to effectively marry content and creative with technology to provide relevance, and how to quantify that, continues to be what I’m passionate about.
Moving specifically to the SEOmoz landing page contest - it was really, really close at the end, and we ran two of the pages for nearly a month side-by-side to get enough results to have a statistically meaningful winner. Can you talk about why it took so much data to declare a winner when our eventual winner had basically always held the lead?
First of all, congratulations on a successful test. When determining the end of a test we look for two things in the data - stability and confidence. The reason it took so much data to get confidence in the results was because it was so close. In your test there were two temporal factors that we really needed to keep measuring. First, the runner-up page was actually getting many more people into the conversion funnel. This meant that there was a high possibility of latent conversions from this page so we needed more time to take that effect into consideration. Second, was that we wanted to get more daily results. Seeing that the winning page performed equal or better the vast majority of the days tested gave us a higher degree of confidence that the results were stable.
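Jon's point about daily results can be made concrete with a simple sign test. Treating each day as an independent 50/50 coin flip is a simplification, but it shows why winning 20 of 29 days (page 3 won only 9) builds confidence that the lead is stable:

```python
from math import comb

def sign_test_confidence(wins, days):
    """Confidence that one variant is genuinely ahead, given that it won
    `wins` of `days` daily comparisons, under a 50/50 null hypothesis."""
    # One-sided p-value: the chance of at least `wins` wins by luck alone
    p_value = sum(comb(days, k) for k in range(wins, days + 1)) / 2 ** days
    return 1 - p_value

# Page 9 won 20 of the final 29 days (page 3 won the other 9)
print(sign_test_confidence(20, 29))  # roughly 0.97
```

A 20-of-29 split alone gives roughly 97% confidence the daily advantage is real, which is why the stable day-by-day pattern mattered alongside the overall conversion numbers.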
Have you experienced results like the SEOmoz contest in the past, where two pages that were incredibly different had such similar results? What can folks take away from the fact that these two pages were completely unique, yet almost identically effective in converting?
Well, I would challenge that they were almost identical in converting. The winning page had a 7% lift in conversion vs. the runner up. I would regard that as a comfortable margin.
What would you say is the next step in the process for landing page design with SEOmoz? How would you proceed from here?
More testing! You have an ideal situation for doing a Multivariate test following up on the learning from the A/B. I would go back to the fact that the runner-up was getting more people into the funnel but these people converted at a much lower rate than the people that went into the funnel from the winning page. Why? MVT should answer that for you.
Finally, do you have 3-4 top tips that you could share on landing page design and testing that you find to be exceptionally beneficial?
Sure, how about I give you three for design and three for testing:
Design
- Simplify everything
- Reinforce intentions
- Limit choices

Testing
- Let the data end your test, not your calendar
- Understand results by segment
- Technology is only as good as the marketer using it
Thanks, Jon. This is greatly appreciated. I really feel that although the direct format and style of the landing page results may not be applicable to all of our readers (though certainly to many), the testing process and the value of testing can't be overstated. We've gone from around .5% conversion to over 2.5% conversion with this contest - a remarkable change by any standard.
We're still contemplating our next move - which may include hiring some professional landing page designers to see if they can do even better. We're also thinking about creating multiple versions of the landing page based on how visitors reached it (paid search vs. on-site ads / logged-in members vs. new visitors, etc).
As always, I'd love to hear your thoughts on this contest, landing page testing in general and your own experiences with the process.
Hi Fellow SEOMozzers,
Paul Robb here.
Rand caught me a little by surprise announcing this when he did and I have nothing prepared. However, I've just added a post to the home page of my website - www.crediblecopy.org - which tells you a little bit more about myself (for anyone who is interested).
I really don't have an ego about my entry, so please don't pull any punches or censor your opinions. There is definitely scope for improvement in the copy and especially in the design. I coded the whole thing in a Windows text editor at the 11th hour.
Of course, improvements have to be proven by testing and not merely public opinion!
I look forward to following the discussion here in this thread and thanks to all of those who have so far offered me their congratulations. I sincerely appreciate it.
My thanks to Rand, Scott and Jeff at SEOMoz for all their work behind the scenes. And also my thanks to OTTO Digital and Offermatica for overseeing a fair contest.
I'll have more to say on all of this later, and I hope that Rand will let me author a blog post on the main blog talking about the contest, responding to some of the comments and observations from fellow contestants and SEOMozzers, and talking about copywriting (salesmanship in-print) in general... and so on.
That's really up to Rand whether or not he wants to allow me to do that. But I'd really like to, and I will ask nicely!
I think it would really be of value to SEOMozzers to allow me to do so... and the payoff for me is that I would get to sell myself a little in the process, of course!
All the best for now,
Paul Robb
www.crediblecopy.org
I'd like to see that, certainly. Looking forward to it.
Hey Paul, Congrats! I would love to read a little bit more about your experience with this process.
Congratulations Paul!
If you have time, I suggest providing a step-by-step guide on the elements of your landing page. I think everybody here has always had a thing about long copy - we think it doesn't work. Yet time and time again it does.
So, make it easy and useful for us by writing THE GUIDE. Plus: I'm sure it will make a great link building project for your new blog!
Congrats Paul. I too would like to see your post discussing the contest and why you made some of the decisions you made with your page.
Why not write it up and submit to YOUmoz? It'll probably end up here as I imagine enough of us will be interested in reading it.
So basically Rand, what you're saying is that out of the 9 landing pages the only one which didn't feature Will's ugly mug got the highest conversion rate?
Hmm... interesting! ;-)
Oi. They didn't all feature me smiling away... I would prefer to phrase that as 'one with Will giving SEOmoz a testimonial came 2nd'.
Yours was my favorite, but that's because I really like lists that compare two unlike things. :)
I agree with MG -- the checklist was an INSPIRED idea.
I would imagine that the professional landing page firm will test & re-test combinations of the best components from each entry.
Wow. I'm very surprised. When I look at the winning landing page it just makes me think it must be selling some kind of dodgy e-book or something.
It seems to go against all design principles. Then I guess selling is different to designing.
However, it would have been interesting if a really well designed landing page with a strong conversion focus had been submitted as well.
A lot of those landing pages seemed fairly amateur to me (not that I could have done any better).
On the flip-side, my personal impression was that the page with the lowest conversion rate (#8) really stood out and had a clear and compelling call to action. Of course, at the end of the day, the point of doing testing is that our personal opinions don't matter to the visitors.
I'm willing to bet that if Page #4 were converted to a white background, its conversions would jump. I've never been able to get a black background page to convert as well as a white (or similar) background page.
That's an interesting comment - I have never tried full backgrounds in similar pages - have you tried EXACTLY the same pages with different (dark vs light) backgrounds?
One more thing to add on my list to check...
I worked in the affiliate marketing group for a large travel-related company. I noticed that of the thousands of affiliate websites we had, those which used a dark background performed poorly. Now this could be because they were travel sites and light colors worked better. But as a whole, I noticed that dark colors usually had poor conversions.
Hey Jeremy, that's very interesting to hear! I am the creator of #4 (wondering why the mozzers had a hard time confirming my identity?) & like Paul I put my submission together in the 11th hour.
I've never done any sort of landing page design before & just did this contest for fun, but that surely would have been great to know before I submitted! :) I'll be sure to keep it in mind if I get any future landing page clients.
I've been waiting for the results of the contest since it was first announced and oddly I never noticed this post in my feed reader. Fortunately I found a link to the post elsewhere.
It's interesting that yet again the long sales page leads to the most conversions. I think most of us would look at that kind of page and wonder how it can perform so well. I know when I see one I scroll right to the bottom to look for the price and then leave, yet time and time again we're told they work, and apparently they do.
I can understand how not having navigation on the page helps since it leaves you with less options other than convert or leave. I can understand how the page length helps in presenting as much information as possible to convince someone to convert. I would think the copy itself plays a large role in how well the page converts.
Somehow I still have a hard time wrapping my mind around why these pages convert so well. My first instinct is almost always to leave as soon as I see one.
But you can't argue with success.
Holy cow... long copy works? :)
Congratulations to Paul.
I think those who are more tech savvy would shy away from traditional sales letters (and possibly, that's the crowd where the checklist did better). For non tech savvy people who are interested in marketing though, this style is a time tested standard for a reason.
I'd recommend The Ultimate Sales Letter by Dan Kennedy if you're interested in the anatomy of Paul's page. While the book is written with direct mail in mind, a lot of it applies online. Headlines that convert well, the guarantees to include, etc.
I think the benefit to long copy is that you get a chance to cover almost every objection someone could present. Most people skim through those letters, but chances are there's one paragraph or section that really speaks to them. That encourages a more thorough reading which can further sell the product/service, etc.
Wow, congrats to the winner of this contest. I'm a huge fan of the long-form sales letter, but this one surprised me based on just how looooooong it is (a whopping 32 pages when copied into Word!). I agree with some of the earlier comments -- try different approaches for different traffic sources (site referrals, brand phrases (kw includes "seomoz") and generic phrases (e.g. seo). Also keep testing variants (like a trimmed down 25 page version ;-)
I think Paul will need a better server and more bandwidth from now on...
I'm not surprised at all.
I actually saw that landing page as a potential customer before I even knew there was a landing page contest. I was considering signing up and was looking for more information. I don't recall anything specific that piqued my interest other than "You must be among the first 7,000 people to respond."
I almost converted right then. In fact, the only reason I didn't was because I had never heard of the 7,000 limit anywhere else on SEOMOZ, and I began searching for more about that. When I found out about the landing page contest, I realized that it was probably a (very clever) sales tactic.
My personal opinion is that the sudden sense of urgency, along with the risk-free guarantee, was the main ingredient which helped push conversions. If I ever have a need for a similar conversion mechanism, I am definitely going to steal this idea. ;)
Perhaps it's just me:
- But when I see a long sales letter type of thing the first thing I think of is "MLM, scam, BS..etc"
- I think these work because after investing 40 minutes reading one, someone feels the need to make those lost 40 minutes worth something... by following up with a subscription.
- I'm sure the person has the thought -
- I just read this for 40 minutes; I don't see myself reading another 40 minute long speech anywhere else - I'll just buy here...
- Although I will avoid it; I must realize this advertising works :/
Congratulations Paul!
It seems to have been a long hard fight. I suspected that yours would do well but I didn't think that you would beat me (#3) in the end.
Again congratulations.
I also want to thank the staff at Point It! for their input and critiques of my landing page drafts.
I just noticed that John's site "credible copy" isn't up anymore... any ideas on what happened to this copy sensation?
All the links to the landing pages are broken.
What a . . . .
And some of them lead to a signup page for this site's services.
Really what you have here is a 404 turned into a user experience frustration.
Kind of lame, really considering . . .
this whole contest seems eminently valuable, but i notice that the landing pages have been taken down or moved. can we get them put back up?
That could boost your conversion. I would like to see how Page 3 performs on "logged-in members" and "on-site ads". I would guess it will perform better than Page 9. On the other side I would like to see how Page 9 performs on "paid search" and "new visitors".
This has really whetted my appetite for a look at the winning and losing pages - but they are all 404!
In fact, this whole thread would ostensibly cause me to think very highly of SEOMOZ (if I didn't already) and if I were new, it might even cause me to want to enter the funnel, convert, etc.
Shouldn't the first lesson in getting the conversion be "all links should work"... right?
Still love ya, hope the links to the winning pages get put back up soon!
I agree with adam-_- that 9 has a distinct style you either love...or hate. And I suspect that the data could reflect the cognitive styles of the visitors.
Though the pages appear to do almost equally well, I'd be willing to bet that the people they sign up are very different and could well turn out to be very different types of customers in terms of how they'd used their memberships. The trick then, would be to figure out which type of customer you want.
FailMoz.
agreed! let's archive these puppies.
I think it's really interesting that #9 did so well, because to me it reads kind of cheesy, and I'd think it might even turn off people who already have some degree of SEO knowledge. Also the style of writing seems such a contrast to the main "voice" of the site which I think really resonates with readers. But the numbers don't lie, so congratulations to the winner.
Another metric you should check is the revenue generated from each landing page. It could be that the losing page had more people sign up for a year whereas the winning page had more monthly people. You would also want to follow the metric throughout the year as it is theoretically possible that the monthly folks will renew and renew and end up staying members longer than the yearly folks.
I've heard a number of times that the long sales letter types convert higher, but have yet to see any real numbers.
One thing to consider in these test results is the conversion process to the "Choose" page. For the #3 example there is a disconnect in the layout. The "Choose" page more closely matches the #9 layout, and this could help with the ultimate number of conversions.
I think it would be interesting to continue these two tests, but give each its own "Choose" page matched to the landing page, then track the conversions.
So, you ran all of them for a few weeks, and then just #3 and #9 had a run-off? Is that why they have such high visitor counts?
From a statistical standpoint, it honestly doesn't strike me that the ones with ~250 visitors had enough data (visits or conversions) to produce reliable results. Obviously, this is extremely difficult with 10 variations, but I'm curious to hear other opinions on this. I've been trying to get a better grasp on how testing in the SEM and usability worlds compare (as I'm much more acquainted with the latter).
Dr. Pete - we basically used standard A/B testing protocols, dropping those pages whose conversion rates clearly were not going to be competitive at the top. We actually went from 10 to 5 to 3 then 2, and the 2 battled it out for the last 3 weeks so we'd have enough data to be confident about the winner. Jon from OTTO advised us on these best practices and he's done this professionally for the last several years :)
Sorry, Big R.; no assault on your or Jon's credibility intended. I was just curious. I've been trying to sort out how the more practical approach to testing (that I'm pursuing with more and more clients) compares to the more rigid (sometimes too rigid) approach I had to follow as a grad. student in experimental psych. There's a happy medium out there somewhere.
Have you tried using any stat calculators? Here is a calculator to measure the statistical significance of your results:
https://www.teasley.net/statcalc.xls
Once you have plugged in all your numbers it is best to find around a 95% confidence level. Once you reach this you can move onto your next tests.
* I did plug in a few of these results and there isn't a confidence level at all. So technically these tests should continue to run.
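For readers who'd rather compute this than open a spreadsheet, here is a minimal sketch of the two-proportion z-test that calculators like this one typically implement; the visitor and conversion counts below are hypothetical stand-ins, not the contest's actual figures:

```python
import math

def ab_confidence(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-tailed confidence that two conversion rates truly differ."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled rate and standard error under the null of equal rates
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = abs(rate_a - rate_b) / se
    return math.erf(z / math.sqrt(2))  # confidence from the normal CDF

# A hypothetical close race: ~2.6% vs ~2.4% conversion on 4,700 visits each
print(ab_confidence(4700, 120, 4700, 112))
```

With numbers that close, the confidence comes out nowhere near 95% - which is exactly why a tight race like this contest has to keep running.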
Unfortunately hitting that level of confidence would require a very large amount of traffic. Approximately 190,000 total visits split evenly.
It took 11 weeks to get 9400 visits, at that rate the test reaches significance after 4 years. I don't think we should wait that long.
If there is a site out there that is putting up that kind of visitor count in a reasonable time e-mail me. I would love to put together a more intense test.
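The traffic estimate above can be sanity-checked with the standard sample-size formula for comparing two proportions; the baseline rate and lift below are illustrative assumptions, and the exact total depends on the power you choose:

```python
import math

def visitors_per_variation(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Visitors needed per page to detect a true difference between
    rates p1 and p2. Defaults: 95% two-tailed confidence, 80% power."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a 7% relative lift on a 2.5% baseline (illustrative numbers)
n = visitors_per_variation(0.025, 0.025 * 1.07)
print(n)  # on the order of 130,000 visitors per page
```

Small relative lifts on low baseline rates are what make tests like this drag on; since the required traffic scales with the inverse square of the difference, doubling the lift cuts the needed traffic by roughly a factor of four.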
Unfortunately, I'm facing similar problems. As much as I believe in testing, waiting 2-3 months to find out that something made no difference at all (or that the original version was better) isn't much fun or particularly productive. It's especially bad when your client is facing a market downturn: we want to test improvements to combat the downturn, but, at the same time, testing is only harder now because their traffic and conversion rates are down.
Well you may be facing a situation where you have to take the risk on a very large change, one that makes enough difference that it is obvious if it is different.
Yeah, I probably tend to be overcautious in testing. I like to know what changes actually made a difference on a microscopic level (for future reference), which tends to lead to an approach that's a bit too incremental.
I'm curious lately, too, how often there's a baseline effect when you're testing an old version vs. a new version. I'm amazed how often website visitors react negatively to change, and I think I'm seeing some of that in my split testing.
Umm... you don't have to run 190,000 visits to find a real winner. I use the calculator every day and always have a new test to run, with a 95% confidence level. If it is close, yes, it may take some time. But if you're only comparing 2-3 landing pages it shouldn't take that long. Your numbers are not going to be that close starting out. Once you work on your campaign and do multiple tests, that is where it starts to get competitive and can drag out for a while.
In normal situations you shouldn't be trying to test out 10 different landing pages, rather 2 or 3 until you find a winner then move on from there.
Paul is a genius! (not to mention a great copy writer)
Har Har. :)
Looks like "old fashioned" advertising techniques still work!
The headline and tone that "Rand" takes on when he begins with "Dear Friend" were not quite what I expected, but that is most often the case with direct response copywriting.
I think this was an excellent exercise!!! In the past I have run various tests on PPC campaigns to see which landing pages do better, but most had very simple variations - colour, images, text format and copy.
But to see 2 significantly different types of landing pages perform so close was an eye-opener. I am quite keen to try a similar exercise.
I was surprised by the winning entry - as I have come across quite a few of these in the past and discarded them - looks like I am missing the boat somewhere! It seems that behavioural factors and the consumer psyche play a big part in the decision to convert - and the winning entry had enough well-written text to keep you interested. As an SEM professional, I would have gone for page 2, but as a lay individual, I guess no. 9 has a pulling factor.
Stands to reason that some of the tried and tested strategies still hold water.
A big congrats to Paul Robb for taking the risk of trying and succeeding with a traditionalist landing page!
A major difference between the top two is that Robb's strongly targets new users, while my design focuses on people who already have strong brand recognition, or are turned off by long copy.
Congratulations to the winner, and to the biggest winner SEOmoz (4 times more subscribers)!
I learned something really interesting today. Our landing page #8, while it was a favorite among members commenting on Mark Barrera's thread, ended up last :-(
I think I will have to start loving all those long sales letters I see everywhere. They definitely work! :-)
Very interesting results - I think this is a masterclass in many aspects of SEO web marketing, from the very idea of making a competition out of improving SEOmoz's conversion rate to the writing and layout of the winning design itself and the analytics used to determine the winner... lots of juicy info here!
I find it particularly interesting that a page featuring such a humourous writing style above the fold actually went on to be the number one page for getting people to part with their cash...
It seems you can use humour to make money. Who'd have thunk it!
Humor to part with cash.. hmm.. Beer advertisers? That and an addiction to hooch.
Nicely done. Long copy sales letters FTW :)
Yeah, I've always heard that the long copy letter is the leader at conversions, but I've never seen a test proving it... (cue dramatic music) until now. It still blows my mind, but I guess it builds trust and taps into what people spend the most time doing on the internet: reading/viewing content.
Well done to everyone involved. I have never been a fan of the "long scroll" landing page but I have to give credit where credit is due.
My gut says a big part of landing page optimization is knowing your audience. A lot of the replies here have commented on the fact that a super long landing page yells "scammer" to them.
When selling a product aimed solely at expert online marketers, a "super long sales letter" landing page would likely therefore be ineffective, as it comes off as unprofessional.
Show the same webpage to an "inexperienced" web user on the other hand (who doesn't regularly find these full page long sales letters on the 'net), and it will likely be significantly more effective...
Congrats Paul Robb! 0.5% to 2.5% is a pretty big conversion jump.
This area of web development fascinates me.
I'd love to see some stats on which parts of the page people look at on landing pages such as the winner.
I see that style of landing page all over, but I don't get why it works yet.
I can understand why it would get people into the funnel yet I'm not sure why they would convert so highly.
I guess I'll have to do some research and testing of my own.
I'm surprised #6 did so poorly. Actually, it was the worst! I thought it would be the best... Shows what I know B)
Just submitted to Digg as I think the results are extremely valuable.
https://digg.com/tech_news/So_You_Think_Long_Sales_Letters_Are_As_Dead_As_A_Dodo
As noted, I think the source of traffic is fairly important as to which one converted better, and maybe it would have been best to chuck some $$$ into PPC on a separate test with the top two.
I think page 9 would certainly beat 3 for cold traffic by a much higher margin, which might suggest that 3 would convert better for much warmer traffic.
This is excellent information that shows the dramatic difference a top converting page can make. It really shows why sites should always be testing. I would have never guessed a page in the format of entry #9 would outperform the other entries, but I guess there is a reason why you see these types of pages around.
Also, this contest was not a bad way to get someone else to increase your own conversion rates! ;)
Great Job!
Excellent detail.
Now I have a week's worth of study ahead of me.
Congratulations Paul.
Congrats to the winner.
But I would like to ask if some LPs were deleted from the contest? My LP submission doesn't appear here... I know it was kinda crap, but anyway, I thought you'd give every submission a shot.
:(
Yes - as we noted in our contest guidelines, we narrowed the field before running the contest, choosing the 11 pages that were accurate, high quality and implementable - we also had to narrow it down because, as you can see, it takes a long time to run a test even with this many entrants, and we wanted to have relatively fast results (though we ended up going a full 3 weeks over time).
Oki doki. Thanx.
Congratulations to the winners, excellent job done!
I'm anxious to see if there will be additional information released on how the pages performed in stages...
It would be interesting to see how the pages were performing at the points where pages were deleted, to see how things were matching up. You can see from the Choose category that two of the top three performers did not include the pricing information on the landing page, which in my view skews the Choose category numbers immensely. #3 is a perfect example of this, as it has the highest Choose %. Is a click at that point any more qualified than someone clicking on the ad to get to the page?
Also of note, I see the winning entry was the only one to offer a 40-day money back guarantee, which was recommended as a way to boost conversion rates in the copyblogger post that was included in the contest, but it was not given as a resource for the contest.
Even if #9 was the only one to shoot off the "Rand, can you offer a money back guarantee?" email, it should have been made public for the contest, as that is a pretty massive selling point that was left out.
Again, congrats to the winners, thanks to SEOMoz for the chance, and hope to see some in-depth case studies in the future!
In the meantime, I'll take pride in having the absolute WORST choose % with the best choose to conversion ratio!
I didn't ask Rand's permission to do anything. I assumed the sale.
You got me, I assumed that it would be proper to ask permission before making an offer like that. I apologize.