The Cardinal Sins of Landing Pages and A/B Testing
This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.
By now, you probably understand that a good landing page is essential for online businesses. You may have even created a few using services like Unbounce. Maybe you’ve bootstrapped it and created one yourself from a CSS template. Heck, maybe you went all the way and hired a designer to make a whole page for you.
But are you getting everything out of your landing page that you could be?
I’ve seen businesses fail again and again at creating and testing landing pages, both in my in-house experience and as an SEO consultant at WhiteFireSEO. Each one of these sins has bubbled to the surface of an otherwise successful campaign and crushed sales—or kept a page from reaching its full potential, anyway.
So you’ve got a landing page, but are you selling yourself short by indulging in these seemingly innocuous sins?
Landing Page Creation
Distracting from Your Main Purpose
If it isn’t contributing to conversions, it shouldn’t be there. A critical problem with most landing page designs is an excess of “extra” content. A landing page has one job: get the reader’s attention, state the benefits, add a little flair, and get them to click. Extraneous features (social sharing buttons, ads, affiliate links, extra images, and so on) have their place on a website, but a landing page is not it.
The last thing you want your potential customer to see on your landing page is an ad from a competitor, and if you’ve got any form of targeted advertising, you’d better believe that an ad for a competing social media consultant is going to show up on your page about social media consulting. Don’t leave shredded cheese around your mousetrap; put it all in a block on top of that pressure-sensitive leg-breaker.
Putting Up Unnecessary Barriers for Conversion
Continuing the (morbid) analogy, no mouse is going to climb up a chair leg, tip over a bowl, and remove a wrapper to get food when it knows it could just scurry off with that tasty stuff that missed the garbage can. Likewise, your potential customers aren’t going to put up for long with “Register to add this to your cart!” and “Enter the birthdate of your mother’s third grade crush to proceed!” messages. Make it as easy as possible to generate a lead or make a sale.
This barrier is most obvious with Facebook apps. Even a 14-year-old girl waiting to take a “What Twilight Character Are You?” quiz is going to stop for a second when the app requests access to every piece of data that could possibly fit into her profile.
Some will argue, “But the less info you get about them, the less targeted your leads are.” The landing page is not the place to weed out the non-committal; that’s what content, SEO, PPC, and everything-else-you-use-to-get-traffic-to-your-site is for. If you’re getting a whole lot of leads who just drop an email address and never convert, then you need to re-evaluate how you’re getting people to your site.
Assuming and Failing to Test
The most important thing I’ve learned from landing pages is that you never know until you test. There are very few true “best practices” for landing pages. Even the tips I’ve given above might not apply to your site (though I’m sure you’d be safe to adhere to them anyway). Don’t assume that because X Study or myfavoriteseosite.com says videos increase conversions, they will do the same for you. Test it. Find out for yourself.
I worked with a site that produced videos for a large share of their pages. They launched an A/B testing campaign to find out what effect the videos had on conversions, and do you know what they found? The videos didn’t really affect conversions. On most pages, the conversion rate stayed about the same whether or not a video was present, even though “conventional wisdom” said they should be seeing higher conversions.
On some pages, though, video DID increase conversions. Was there a pattern? No, not really. They shared a few similar attributes, but nothing to draw conclusions from. The takeaway? Test it.
A/B Testing
My favorite A/B testing axiom is probably “If it makes you money, it can probably make you more money.” A/B testing is one of the most powerful ways to milk everything you can out of your landing pages, but so many people do it wrong that it may not seem worthwhile. These are the biggest offenders I’ve come across:
Changing Multiple Things at Once
Sure, it’d be useful to know that your new template converts better than your old template, but do you know why? Probably not. You could say it was the color scheme that did it, maybe the button placement, maybe your added video, but if you’ve made all of these changes at once, you don’t really know.
Test one thing at a time. If you truly want to know what works on a landing page, you need to isolate a single variable and keep all others constant. That way, if you change only the button color and see conversions increase, you can be confident it was the color that made the difference.
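One practical way to keep a single-variable test clean is to bucket each visitor deterministically, so a returning visitor always sees the same variant and your two groups stay cleanly separated. Here’s a minimal sketch in Python; the function and experiment names are illustrative, not from any particular testing tool:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "button-color") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the visitor ID (instead of flipping a coin on every
    pageview) means repeat visits land in the same bucket, keeping
    the test groups consistent for the life of the experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor always gets the same variant:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Because the split depends only on the visitor ID and the experiment name, running a second experiment under a different name re-shuffles visitors independently of the first.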
Testing on New Content
Don’t A/B test on new content. This is related to the above, but different (and common) enough to merit its own paragraph. If you’ve just released a piece to the web, you won’t be able to say exactly what made it go viral; maybe it was timely, or exceptionally interesting, or useful, or countless other things. You can’t draw any useful conclusions from tests on new content. Use an established page that gets consistent traffic for your A/B tests. Anything else will do nothing but mislead you.
Testing on Different Pages
Using different formats on separate pages is not an A/B test. If you format two blog posts differently and one gets more traffic or more shares, you can’t say it’s because of the format change. It might be, but it also could be that the more successful piece was just written better. It could even just be that two more people decided to click that link than usual. Maybe your page just happened to show up in the right season for them. Who knows? Keep your A/B tests on the same page.
Drawing Conclusions from Insufficient Data
You’ve got to have a healthy sample size to draw any meaningful conclusions. This might be the most heavily abused sin on this list. I won’t get into all of the math behind it, but it has to do with confidence intervals, statistical error, normal distributions, and a bunch of stuff you probably don’t want to read right now. For perspective: Gallup polls are designed around a 95% confidence level, which means that if the same poll were repeated many times, roughly 95% of the resulting intervals would capture the true value, whether the poll is about an election, a health study, or anything else.
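To make the sample-size point concrete, here’s a rough sketch (plain Python, normal approximation; the traffic numbers are hypothetical) of how the 95% margin of error on a measured conversion rate shrinks as visitors accumulate:

```python
import math

def margin_of_error(conversions: int, visitors: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for an observed conversion rate.

    Uses the normal approximation; z = 1.96 corresponds to 95% confidence.
    """
    p = conversions / visitors
    return z * math.sqrt(p * (1 - p) / visitors)

# A 20% observed rate is a very different claim at different sample sizes:
for n in (15, 150, 1500):
    moe = margin_of_error(round(0.2 * n), n)
    print(f"n={n}: 20% ± {moe:.1%}")
```

At 15 visitors, a 20% observed rate carries a margin of error around ±20 points (the true rate could be anywhere from roughly 0% to 40%); at 1,500 visitors it tightens to about ±2 points.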
What does that mean for you? That means that just because you got thirty visitors to your A/B testing pages and two people clicked on the first while four clicked on the second, it doesn’t mean your second page is better. It might be, but you can’t be sure, because your sample size isn’t large enough. Correct sample sizes can vary, but use this as a guideline: the less data you have, the less confident you should be in the results.
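As a rough illustration of why those six total conversions prove nothing, here’s a minimal two-proportion z-test (plain Python with the normal approximation; this is a sketch, not a substitute for a proper testing tool) run on the thirty-visitor example above, assuming an even 15/15 split:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the standard normal CDF:
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 2 of 15 conversions on page A vs. 4 of 15 on page B:
z, p = two_proportion_z(2, 15, 4, 15)
print(f"z = {z:.2f}, p = {p:.2f}")  # z ≈ 0.91, p ≈ 0.36: nowhere near significant
```

A p-value of roughly 0.36 is far above the conventional 0.05 cutoff, so despite page B converting at double the rate, this result is entirely consistent with random noise.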
So what do you guys think? Talk to me about your experiences with landing pages and A/B testing.