
The Trials and Tribulations of Conversion Rate Optimization

Steven Macdonald

This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.


Conversion rate optimization now ranks alongside content marketing as the top priority for digital marketers. The demand for conversion rate experts has never been higher, and conversion rate optimization is now a common fixture in the SEO's job description. What, we didn't have enough to do as it is?

A new era in online marketing

Online marketing is no longer about traffic numbers and visits; it's about making the website convert and increasing revenue. According to the 2012 Conversion Optimization Report, A/B testing, copywriting, surveys and usability testing are among the most popular techniques used to increase conversions, with A/B testing the most used method for two years running.

Some of the best research to be published on conversion rate optimization has come from Marketing Charts, which over the last twelve months has found that:

  • 98% of marketers see A/B testing as a valuable method for conversion rate optimization
  • 87% of marketers need to improve how they A/B test
  • Only 13% of marketers are happy with their A/B testing approach

The biggest challenges marketers face in conversion rate optimization are lack of budget and complexity, according to a 2009 report from the eMetrics Marketing Optimization Summit.

But a lot has changed since 2009. A/B testing is now THE buzzword in online marketing, and testing tools are as much a staple of the marketer's arsenal as web analytics and email marketing software. Success stories seem to pop up everywhere. The truth is that A/B testing is a lot harder than you might think. Like me, you probably only see the case studies of huge improvements to conversion rate and sales; in reality, it's the smaller wins that add up over time.

A/B testing is not just about randomly picking a page on your website and testing it. As marketers, we should be selective and choose high-value, high-traffic pages such as site registration, contact, and checkout, where even a slight increase in conversion rate means more customers. It sounds simple, right?

Unfortunately, not all A/B tests end with smiles. I've read the success stories, but I've also experienced the downside of A/B testing: tests that have resulted in a loss of thousands of dollars. Here's a look at two recent tests I've run that didn't perform well.

Case study #1: Contact us page

At Softmedia, we wanted to increase the number of leads generated from the website. The easiest win was to improve the contact page by removing fields that were not required and implementing inline validation. Conversion rate increased immediately, but we wanted to take it further by adding trust signals.

We added a (non-clickable) banner to the right of the form featuring the logos of brands that are Softmedia clients. The variation was identical to the control except for this banner.

Before I discuss the results, I know what you're thinking: why didn't we remove the CAPTCHA? We kept it in place to stop spam emails, which had doubled or tripled our email volume without it.

A lot of brands use customer logos for promotion and for added trust and credibility, so adding them next to the form seemed like a sure way to boost conversion rates; a best practice, even. After running the A/B test in Optimizely for seven days, we were surprised to find the following results:

Conversion rate:

  • Original (without brand logos): 9.4%
  • Variation (with brand logos): 2.2%

The variation with the client logos saw a 76% decrease in conversion rate. A/B testing is supposed to be about increasing leads and conversions, but this test cost us approximately 50 leads, worth roughly $3,000-4,000 in contract value: not a loss that an SMB can take lightly.
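If you want to sanity-check figures like these, the arithmetic is simple. Here is a minimal sketch; the visitor count is a hypothetical assumption used only to show how the relative drop and the lost-lead estimate are derived, not the actual test data:

```python
# Back-of-the-envelope check on the reported figures. The visitor count
# is a hypothetical assumption, not the actual Optimizely data.
control_rate = 0.094      # 9.4% conversion without brand logos
variation_rate = 0.022    # 2.2% conversion with brand logos

relative_change = (variation_rate - control_rate) / control_rate
print(f"Relative change: {relative_change:.1%}")     # about -76.6%

variation_visitors = 700  # assumed traffic sent to the variation
lost_leads = variation_visitors * (control_rate - variation_rate)
print(f"Leads not generated: ~{round(lost_leads)}")  # ~50 leads
```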

Case study #2: Checkout page

At nameOn, we used Google Analytics data to find that 31.7% of visitors abandoned the checkout page. But why? Why would they leave the site after all the hard work of browsing the website and adding items to their cart?

When reviewing the page, we saw that there were nine calls to action on the checkout page, including "Sign up for newsletter", "Like us on Facebook" and "Go to home page". The only buttons we really needed were "Remove" and "Go to payment". The new page looked much cleaner.

We removed the distractions to make it easier for visitors to complete their purchase. It seemed like an easy enough A/B test to improve the bottom line. However, once again the results were surprising:

In this test, conversion decreased by 17%, which meant more visitors abandoned the shopping cart and fewer people completed their orders. In monetary terms, the test cost us approximately $8,000-10,000 in revenue. Yikes!

Two A/B tests and two sites that saw a loss in sales. This is not something I was proud of, and the idea of launching another test was a little daunting. I had to evaluate what went wrong and what I had learned. This led me to run new A/B tests, which proved successful.

Lessons learned from A/B testing

1. Don’t end A/B tests too early

Although the two case studies above were pulled after seven days, I don’t recommend pausing A/B tests too early unless you see a dramatic loss in revenue. One A/B test I ran earlier this year saw an increase in conversion of 197%.

Had I pulled that test too early, the end result would have been different.

By allowing the test to reach a significant number of visits and conversions, the results became more reliable, and both variations outperformed the control. Experts recommend that an A/B test run for at least two weeks and collect at least 100 conversions.
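If you want a more objective stopping rule than the calendar, a two-proportion z-test on the raw visitor and conversion counts tells you whether the gap you are seeing is likely to be real. This is a minimal, generic sketch, not the statistics any particular testing tool runs, and the example counts are made up:

```python
# Minimal two-proportion z-test for an A/B test. This is a generic
# illustration, not the exact method Optimizely (or any other tool)
# uses; the example counts are hypothetical.
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-sided p-value for the difference in conversion rate."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z)))

# A week of traffic can look decisive while still being noise:
p = ab_test_p_value(1000, 94, 1000, 110)
print(f"p-value: {p:.3f}")   # ~0.24 here, i.e. nowhere near significant
```

At these hypothetical volumes, even a 17% relative lift produces a p-value around 0.24, which is exactly why the two-week, 100-conversion rule of thumb exists.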

2. Don’t use your gut – Use data

Before you perform an A/B test, make sure the decision to test a page is based on real data, not just internal discussions or gut feelings. Every successful test I have performed has been based on real data. An A/B test is about making your website convert better, not about which color the product team prefers on its page (true story!).

3. Test pages with volume traffic

Launching an A/B testing schedule and watching the tests perform well can be addictive, but your tests should be run on high-traffic pages. One of my first A/B tests was performed on a page that saw fewer than 100 visits per month, which meant I was hitting refresh and checking for new conversions four or five times a day.

A/B testing is not always the answer to your conversion rate problems. How do you A/B test site speed? Can you A/B test a bigger product selection? What about testing a long B2B buying process that takes anywhere from three to six months? Maybe you can, but these changes are a lot bigger than simply pasting a URL into testing software and clicking publish.

How to increase conversion rates the right way

Whatever best practices you have heard, there really isn't a one-size-fits-all approach to conversion rate optimization; the case studies above prove it. Every post you read about A/B testing or conversion rates ends with "test, test, test", and for good reason: what works for me might not work for you.

If you really want to increase conversion rates, you need feedback from your customers, which can come in the form of surveys, questionnaires or heat maps. Qualaroo and 4Q (now iPerceptions) are both great for collecting customer feedback on site, and at $9 per month, CrazyEgg is a fantastic heat map tool.

Ask your customers how you can improve

Using Qualaroo, we ask web visitors questions such as:

  • What information is missing from this page?
  • What can we do to improve our website?
  • What stopped you from completing your order?

The answers to the third question helped nameOn understand why conversion rates in Safari were much lower than in Firefox and Internet Explorer: checkout could not be completed if JavaScript was not enabled.

In retrospect, we could have diagnosed this issue earlier by spending more time investigating the problem in Google Analytics, but without Qualaroo we would still be unaware.
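The kind of check that would have caught this sooner is a simple conversion-by-browser breakdown. Here is a rough sketch that assumes a browser report exported from Google Analytics to CSV; the file name and column names are assumptions for illustration, not an actual report format:

```python
# Flag browsers converting well below the site-wide rate. Assumes a CSV
# export with one row per browser; "browser_report.csv" and its column
# names are assumptions for illustration.
import csv

with open("browser_report.csv", newline="") as f:
    rows = list(csv.DictReader(f))   # columns: browser, sessions, transactions

total_sessions = sum(int(r["sessions"]) for r in rows)
total_transactions = sum(int(r["transactions"]) for r in rows)
site_rate = total_transactions / total_sessions

for r in rows:
    rate = int(r["transactions"]) / int(r["sessions"])
    flag = "  <-- investigate" if rate < 0.5 * site_rate else ""
    print(f"{r['browser']:<20}{rate:6.1%}{flag}")
```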

We're now working on fixing this issue to allow more Safari users to complete their purchase. Based on the conversion rates for Firefox (12%) and Internet Explorer (9%), if we can convert Safari users at a similar rate, fixing this problem could be worth approximately $60,000 this year.

Find out why people visit your website

Using 4Q's free online survey plan, we were able to understand what customers came to the website to find. At Hurtigruten, we found that 44% of respondents came to the website for pricing information. Prices were available in the booking engine itself, but we decided to launch a "price" page with basic pricing information that linked directly into the booking engine.

This page quickly became the third most viewed page on the website, collecting more than 60,000 views in six months.

By implementing this page, we sent thousands more visitors into the booking process and, based on the booking conversion rate (0.80%) and average order value ($1,600), the improvement is worth between $400,000 and $600,000 per year.
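That sizing is simply conversion rate times average order value applied to the extra traffic. The extra-visitor figures below are assumptions chosen to show how a range like $400,000 to $600,000 falls out of the quoted numbers:

```python
# Rough opportunity sizing for the price page. The extra-visitor figures
# are assumptions for illustration; the conversion rate and average order
# value are the numbers quoted above.
booking_conversion_rate = 0.008   # 0.80%
average_order_value = 1600        # USD

for extra_visitors_per_year in (32_000, 47_000):
    revenue = extra_visitors_per_year * booking_conversion_rate * average_order_value
    print(f"{extra_visitors_per_year:,} extra visitors -> ~${revenue:,.0f}/year")
```

Put another way, at these numbers every additional visitor sent into the booking engine is worth about $12.80 in expected revenue.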

Use heat maps to understand user behavior

Using CrazyEgg heat maps, we found that visitors to the SuperOffice product page were mostly clicking the drop-down accordions to view content. Feedback from user tests on usertesting.com showed that the accordions were preferred over a longer page, but rather than make visitors click relentlessly to view content, we tested a variation of the product page.

The variation outperformed the control, and more leads were generated through white paper downloads and free demo bookings.

Now it’s your turn

You won't always increase conversion rates immediately with A/B testing, but it will pay off in the long term. Use real data to decide what to test. And if you're struggling to get your boss on board, one of the best ways I've heard to get internal buy-in is to tell them you're "conducting market research"; once you have significant data, share it with the organization and show them the impact of testing.

  1. Collect feedback from your customers
  2. Test variations using A/B testing software
  3. Share success stories internally and externally

Conversion rate optimization is no longer optional. Your competition is already testing ways to create a better user experience and convert more visitors. Don't fall behind and miss out on improving your customers' web experience.

What are you doing to improve conversion rates on your website? Do you prioritize conversion rate optimization or content marketing? Let me know in the comments below.

Steven Macdonald

Steven Macdonald is a digital marketer at SuperOffice and is based in Tallinn, Estonia. At SuperOffice, Steven writes about email marketing and customer service. You can connect with Steven on LinkedIn and Twitter.
