Split-testing is one of the cornerstones of conversion rate optimisation (CRO) and it should therefore be a key part of your digital marketing strategy.
Why? Because, as you'll see from this case study, split-testing has the potential to (at least) double the number of sales or leads that you get from your website.
But before we get into that, let's explain what split-testing actually is.
What is Split-Testing?
Split-testing, or A/B testing as it is also called, has been around since long before websites were invented. It's the process of having two different versions of a piece of marketing material - whether that's a web page, an online advert or, in the offline world, a mailshot or flyer - and seeing which one performs better.
Usually, there's only one thing that differs between version A and version B of this piece of marketing - for example, the headline on a landing page.
An experiment is then run in which half of the audience gets to see version A and the other half gets exposed to version B, and a record is kept of how many conversions (sales or enquiries) are generated by each version. This is obviously much easier to do in the online world than it is with printed materials such as direct mail letters, posters, or flyers.
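The mechanics described above can be sketched in a few lines of Python. This is a hypothetical illustration (not the setup used in this case study, which relied on an off-the-shelf tool): each visitor is deterministically bucketed into version A or version B, and visits and conversions are tallied per version.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministic 50/50 split: the same visitor always sees the same page."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
    return "A" if digest[0] % 2 == 0 else "B"

visitors = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

def record_visit(visitor_id: str, converted: bool) -> None:
    """Log one visit and, if it led to a sale or enquiry, one conversion."""
    variant = assign_variant(visitor_id)
    visitors[variant] += 1
    if converted:
        conversions[variant] += 1
```

Hashing the visitor ID, rather than flipping a coin on every page load, means a returning visitor keeps seeing the same version - which matters, because showing someone both pages would muddy the results.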
Sometimes, versions A and B will both generate roughly the same number of conversions. At other times, you may find that one version ends up with a much higher conversion rate than the other. When that happens, the logical conclusion is that the single thing you changed (e.g. the headline) is what caused the winning page to outperform the other one.
If and when a clear winner is found, you would direct all of your traffic to that version from then on. And then you would want to run another split-test experiment to see whether by changing something else you could improve your results even further.
My Split-Testing Experiment
I recently wrote a book about lead generation which people can buy from this website and download as a PDF eBook.
I created a landing page to promote the book, and I drove traffic to it using a variety of channels such as email, paid ads, and social media.
Sales went well but, after a few weeks, I wanted to see if I could get even better results.
So I decided to change one image on the landing page and run an A/B test to see what difference (if any) this would make.
The original page design (version A) included a picture of me in the header section, as shown here:
On the new version (version B), I swapped this for a picture of the product instead. Everything else was identical across both versions of the page. Here's what the header of version B looked like:
Running the Split-Test
Once I had created both versions of the page, I needed a way to automate the distribution of visitors so that half of them were sent to version A and half to version B.
Fortunately, this is pretty easy to do. At the time, Google offered a free tool called Google Optimize which could handle everything for you - from distributing the traffic through to recording and analysing the results - and all you had to do was add a short piece of code to your website. (Google has since discontinued Optimize, but plenty of other testing tools work in the same way.)
However, because I'm a Thrive Themes member, I was able to use the Thrive Optimize plug-in to create and run my split-test, which meant I didn't need to add any extra code to my website.
As you can see from the Thrive Optimize screenshot below, during the course of the experiment, each page received roughly 400 unique visitors.
But there was a huge difference in conversions: the original version A page generated six sales, while version B (the one with the product image) generated twelve - a 103% improvement in conversion rate.
Very often you need to let a split-test run for much longer than this, and accumulate far more conversions, before you can be confident you've found a winner. In this case, however, with one page performing twice as well as the other, we could say with over 92% confidence that a winner had been found after only 18 sales.
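To see where a confidence figure like that comes from, here's a minimal sketch using a one-tailed two-proportion z-test, with round numbers close to the ones in this experiment (6 vs 12 sales on roughly 400 visitors per page; the actual counts were slightly different, and Thrive Optimize may use a different statistical method, so this is an illustration rather than a reproduction of the tool's maths):

```python
from math import erf, sqrt

def confidence_b_beats_a(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """One-tailed two-proportion z-test: how confident can we be that
    version B's true conversion rate is higher than version A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF of z

# Round numbers close to the case study: 6 vs 12 sales, ~400 visitors each
print(f"{confidence_b_beats_a(6, 400, 12, 400):.1%}")  # -> 92.4%
```

With these inputs the test lands at just over 92% - consistent with the figure above - which is why a gap this large can be called early, even on a small number of sales. A smaller gap between the two pages would have needed far more conversions to reach the same confidence.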
So what can we learn from this case study? Well, I think there are two key things to take from it:
- Running split-tests can make a huge difference to your bottom line. Why increase your advertising spend when you could run an A/B test instead and potentially get a massive increase in sales or leads without spending anything extra?
- I don't have the kind of face that sells things!
Have you run any interesting split-tests yourself in the past? Or has this article given you the inspiration to try some now? Leave a comment in the box below and let us know.