5 Best A/B Testing Case Studies and Learnings

Designing an effective website requires striking a balance between art and science.

On the surface, web design may seem like a primarily creative endeavor. After all, colors, fonts, and images are all somewhat subjective, and the ones that are right for your business depend on your audience and the brand image you want to convey.

But an aesthetically pleasing design shouldn’t be your only concern.

When it comes down to it, your website’s primary goal is to get results for your business — in the form of sales, leads, form submissions, and other conversions that matter for your goals. This means it needs to not only look great but also provide a seamless browsing experience that drives visitors to take action. Fortunately, you don’t need to leave this goal solely in your designer’s hands.

Instead, you can take a data-backed approach by testing different elements to see which generate the best results. And while there are a few different ways you can approach this process, one of the best is with A/B testing.

What is A/B testing?

By this point, most experienced marketers have at least heard of A/B testing, even if they haven’t run a test themselves.

And the basic concept is a simple one. It involves creating two variants of a page, then testing them against one another to see which is more effective in achieving the desired goal. These tests are run by dividing visitors between both versions so that the results represent user behavior over the same time frame.

For example, let’s say you weren’t sure whether to make a specific element on your pricing page red or blue. Instead of making this decision based on your personal preference, you could use A/B testing to see if either option had an impact on conversions.

First, you’d create one variant with the red element and one with the blue. Then, for a straightforward test, you’d split your traffic evenly between them.

At the end of the test, you’d examine your results to see if one version generated a higher conversion rate for the page’s target goal. If one variant outperformed the other, you could implement it permanently, with the goal of achieving that higher conversion rate for all of your visitors.

Of course, this is an extremely simplified hypothetical test, and there are many ways to alter the A/B testing process to suit your needs. But for the sake of a general explanation, it works — and it’s all you need to know to benefit from the case studies on this page.
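To make the red-versus-blue example concrete, here’s a minimal Python sketch of how you might evaluate the results once the test has run. The visitor and conversion counts are hypothetical, and the two-proportion z-test is just one common way to check whether the observed difference is likely to be real rather than random noise; most A/B testing tools run a similar calculation for you behind the scenes.

```python
from math import sqrt

def conversion_rate(conversions, visitors):
    """Fraction of visitors who completed the page's goal."""
    return conversions / visitors

def z_score(conv_a, vis_a, conv_b, vis_b):
    """Two-proportion z-test comparing variant B against variant A."""
    p_a = conv_a / vis_a
    p_b = conv_b / vis_b
    p_pool = (conv_a + conv_b) / (vis_a + vis_b)                    # pooled rate
    std_err = sqrt(p_pool * (1 - p_pool) * (1 / vis_a + 1 / vis_b))
    return (p_b - p_a) / std_err

# Hypothetical results: 5,000 visitors per variant,
# 200 conversions for the red element (A), 260 for the blue element (B).
print(conversion_rate(200, 5000))      # 0.04  -> 4.0% for red
print(conversion_rate(260, 5000))      # 0.052 -> 5.2% for blue
print(z_score(200, 5000, 260, 5000))   # ~2.86; above 1.96, so significant at the 95% level
```

Until a difference clears a significance threshold like this, the “winning” variant may simply be ahead by chance, which is why tests are typically left running until enough visitors have been split between the two versions.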

Related Article: A beginner’s guide to A/B testing

5 A/B testing case studies

The possibilities for A/B testing a website are virtually endless.

Everything from your layout and graphics to your CTA copy and buttons can have an impact on conversions, making all of these elements fair game for testing. And while that means you have plenty of opportunities to uncover conversion-boosting improvements, it can also make it difficult to know where to begin.

That’s why in this post, we’ve compiled five A/B testing case studies you can use as inspiration.

1. Arenaturist gets more form submissions with vertical layout

Arenaturist.com is a hotel and resort booking website, so their most important conversion goal is, of course, getting visitors to book accommodations.

That’s why their homepage prominently features a form asking for details like the dates a user is traveling, their destination, and the number of people they’ll be traveling with. Collecting this information is an essential first step in the process. As a result, the company decided to run an A/B test on their form, with a focus on creating a new version that earned more submissions.

The original, or “control,” version of their homepage featured the form in a horizontal bar just above the fold.

[Image: Arenaturist homepage with the original horizontal booking form]

For the test, they designed a new, vertical variant.

[Image: Arenaturist homepage variant with the vertical booking form]

The colors, fields, and copy remained entirely the same. And while the new version appears to take up more space on the page due to its placement, the font and button sizes were unchanged.

So, was this enough to make an impact?

Yep.

The variation generated a 52% higher conversion rate for the site than the original. And when you consider that this form is the first step in the booking process, that’s a lot more potential revenue for the site.

2. ATLAS Workbase boosts conversions by rephrasing exit offer

When coworking space ATLAS Workbase wanted to earn more members, they focused on getting their site visitors to check out their facility in person.

As a result, they added the following exit-intent popup to their site.

[Image: ATLAS Workbase exit-intent popup inviting visitors to book a tour]

This popup earned the company around two or three scheduled tours per week — but they knew they could do better.

As a result, marketing agency Logic Inbound suggested changing the primary call to action from “Schedule a Tour” to “Start Your Free Trial.”

This way, instead of simply walking through the space, prospective members could show up and actually try it out for themselves.

And this approach worked. The popup that framed the visit as a “trial” instead of a “tour” converted at 25.71% — an 852% increase over the original.
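For context on how that figure is calculated, relative uplift is typically expressed as the change in conversion rate divided by the original rate. Working backwards from the reported numbers (the original popup’s rate below is inferred, not something the case study states):

```python
# Relative uplift is usually computed as (new_rate - old_rate) / old_rate.
# Working backwards from the reported numbers (an inference, since the
# original "tour" popup's conversion rate wasn't published):
new_rate = 25.71                           # % conversion for the "trial" popup
uplift = 8.52                              # 852% increase, as a multiplier
implied_old_rate = new_rate / (1 + uplift)
print(round(implied_old_rate, 2))          # ~2.7 -> roughly 2.7% for the "tour" popup
```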

So while many marketers focus their A/B tests on visual elements, it’s important to remember that these aren’t the only elements you can test. In some cases, adjusting your copy or offers is much more effective, and can be exactly what it takes to achieve the conversion rates you want.

3. BeemDigital generates more downloads by moving testimonials

Many businesses publish ebooks, guides, and other resources as a way to generate leads. But for this strategy to work, the business’s target audience has to download them. One of the best ways to convince visitors to take this action is to include customer testimonials.

The logic here is that if a visitor can see that other consumers have already read, enjoyed, and gained value from a resource, they’ll be more compelled to read that resource themselves.

So before publishing an ebook, Ben Gheliuc of BeemDigital shared it with a few industry professionals. Then, he asked each of them for a quote and added those quotes to the ebook’s landing page.

At first, he placed all of these testimonials below the download form. And considering that the page’s primary goal was to generate downloads, it made sense to position the form as high as possible. Then, he decided to see whether featuring some of his best testimonials above the form would have an impact on conversions.

Placing testimonials before the form boosted the page’s conversion rate from 38.49% to 63.33% — for a whopping increase of 64.53%.

This goes to show that even if you think you know the best way to arrange the elements on a page, you might be surprised by what works for your audience — and the only way to know for sure is to test.

4. Unveil earns more beta users by removing pricing information

When UK-based marketing agency Tone launched a new design app called Unveil, their primary goal was to gather leads for a free beta period. While these users would eventually be asked to sign up for a paid product, Tone’s main goal within the beta period was to collect feedback they could use to improve the product.

Initially, they decided to give visitors context by including information about the pricing plans they’d be introducing at a later date. As you can see in the original version, the copy at the top of the page explains that Unveil’s eventual pricing would start at “just $1 per month.”

[Image: Top half of the original Unveil landing page, including the pricing copy]

But because the trial itself was free, they decided to see what would happen if they removed that pricing information — even though the original copy made it clear that trial users wouldn’t actually have to pay.

As a result, they created a variant with no reference to money or pricing.

And by removing those references, they were able to earn 31% more signups for their beta trial — illustrating that sometimes, the best way to get your visitors to take action is to eliminate all of their reasons not to.

5. Brookdale increases conversions with a static image

Brookdale Living offers community and assisted living for senior citizens. But before optimizing their site, their “Find a Community” page didn’t do much to encourage visitors to learn more.

[Image: Original version of Brookdale’s “Find a Community” page]

Marketing agency Fathom made the page more compelling by adding testimonials, credibility logos, and unique selling points. They also wanted to add some visual content but weren’t sure whether a static image or a video would be more effective for this goal.

As a result, they tested variants with both. First, they created a version of the page with a photo.

Then, they created a version with a video of several residents talking about their experiences with Brookdale.

Now, if you pay attention to marketing trends, you might think the result here is obvious. Video is widely touted as the most popular, most engaging, and most effective way to communicate with consumers.

So the second variant had to win, right?

Nope.

In fact, the variant with the static image outperformed the variant with the video and resulted in $106,000 in additional revenue for the company.

As this case study shows, it’s not always a good idea to rely on trends or best practices to shape your site. After all, your goal isn’t to create a site that’s in line with marketing trends — it’s to create one that works for your target audience.

And sometimes, those two things aren’t necessarily one and the same.

Conclusion

Designing a high-converting website can be challenging. Fortunately, you don’t need to rely on your instincts to create a site that gets the results you want.

Instead, you can use A/B testing to take a data-backed approach to choosing the elements your site visitors prefer. This way, you can be confident that you’re designing a website that’s in line with your target audience’s unique preferences.

As the five case studies on this page illustrate, you might be surprised by what you learn — and the only way to find out is to start testing.