Beginner’s Guide to A/B Testing
Email opens ground to a halt? Click-through rates at a standstill? Even if your numbers are climbing, good marketers know there is always room for improvement…and nothing drives greater improvement in campaign performance than testing, testing, testing!
In our beginner’s guide to A/B testing we will cover how to conduct an A/B test, what you should be testing, and how to act on your findings.
What is AB Testing?
A/B testing, or split testing, is an experiment where you test two different versions of a landing page or email simultaneously to determine which is better optimised for conversion.
It’s basically the application of a simple scientific method to your online marketing efforts. It isn’t time-consuming, and there is all sorts of software – such as marketing automation platforms – to make it really easy. Or, as long as you carefully record your results, you can do it without any special software at all!
How Do I Run an A/B Test?
A/B testing is a way to compare two landing pages or emails – A and B.
A and B are identical except for one independent variable. This independent variable could be anything you wish to study.
A few examples:
- Entire emails and landing pages
- Headlines and subject lines
- Incentives and offers
- Send times
And obviously there are many sub-tests you could do for each of these elements – colour, placement, size etc. etc. – the options are endless…
For some more ideas about tests to run read 10 Things You Should be AB Testing.
By testing the audience’s response to A and B, it is possible to determine which of the two is the most effective in increasing click-through rates, open rates, response rates or conversion rates – allowing you to optimise your content based on your audience preferences.
You could also run multivariate tests, comparing multiple variables at the same time. Do users respond better to image A next to a short form? What about image B next to a longer form? How does this change when the landing page headline is altered?
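To see how quickly multivariate combinations add up, here is a minimal sketch (the variant names are hypothetical placeholders, not from any particular platform) that enumerates every combination of the elements mentioned above:

```python
# Hypothetical sketch: enumerating variant combinations for a multivariate test.
from itertools import product

images = ["image_A", "image_B"]
forms = ["short_form", "long_form"]
headlines = ["headline_1", "headline_2"]

# Every combination of image, form and headline is its own variant.
variants = list(product(images, forms, headlines))
print(len(variants))  # 2 x 2 x 2 = 8 combinations to test
```

This is why multivariate tests need much more traffic than a simple A/B test: each added variable multiplies the number of variants, and every variant needs its own statistically meaningful sample.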
In order to give accurate results, marketers MUST test the two groups at the same time. Otherwise external variables such as time of day, day of the week or media trends – which are impossible to control – may contaminate the results.
ALSO! Make sure you let your A/B test run for long enough: stop only once your sample is large enough to reach statistical significance.
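If you’d like to check significance yourself rather than rely on your platform’s dashboard, here is a minimal sketch using a standard two-proportion z-test. The function name and example counts are illustrative assumptions, and the z-test is one common approach, not the only one:

```python
# A minimal sketch of a two-proportion z-test for comparing two conversion
# rates. Inputs are raw counts: conversions and total recipients per group.
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for the rate difference."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B perform the same.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 120/1000 conversions for A vs 150/1000 for B.
z, p = ab_significance(120, 1000, 150, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A p-value below 0.05 is the conventional cut-off for declaring a winner; if p is above that, keep the test running and collect a larger sample.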
A Bit of Proof That it Works…
Optimisation was the name of the game for the Obama Digital team during the 2012 election. The Deputy Director of Frontend Web Development at Obama for America revealed they optimised just about everything from web pages to emails!
“Overall we executed about 500 A/B tests on our web pages in a 20 month period which increased donation conversions by 49% and sign up conversions by 161%. As you might imagine this yielded some fascinating findings on how user behavior is influenced by variables like design, copy, usability, imagery and page speed.”
So…if it’s good enough for the President of the United States…
Why You MUST A/B Test
When you conduct A/B testing, your audience is telling you what they like – and what they don’t – about your online marketing efforts. You can then make the incremental changes that keep your buyers engaged, lowering bounce rates, improving conversion rates, and, most importantly, increasing sales.
And Don’t Ever Stop Testing!
Your audience’s preferences will certainly change over time, so your marketing should adapt. Running A/B tests on an ongoing basis means you can update your landing pages, forms, emails, images, and everything else to keep up with changing trends.
Won’t I Lose Subscribers?
“But if I start testing different elements, won’t I lose my subscribers in the process?” I hear you cry!
The fact of the matter is that you can use A/B testing in a way that will not hurt your current click-through and open rates.
Many marketing automation platforms allow you to A/B test on a small percentage of your subscribers and then use your sample findings to predict how those results will translate to your entire subscriber list. For instance, you could send 10% of your subscribers an email at your regularly scheduled time (Version A) and then send another 10% of your subscribers an email at a different time (Version B).
Once you know which time triggered better performance rates, you can then use that winner as the designated time for emailing the remaining 80% the following day.
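The 10%/10%/80% split described above can be sketched in a few lines. This is an illustrative assumption about how you might implement it yourself (the function name and fractions are placeholders), since in practice your marketing automation platform would usually handle the split for you:

```python
# Hypothetical sketch of the 10% / 10% / 80% subscriber split described above.
import random

def split_for_ab_test(subscribers, test_fraction=0.10, seed=42):
    """Shuffle the list and carve out two equal test groups plus the rest."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # seeded so the split is reproducible
    n = int(len(pool) * test_fraction)
    group_a = pool[:n]          # receives Version A (e.g. the usual send time)
    group_b = pool[n:2 * n]     # receives Version B (e.g. the alternative time)
    remainder = pool[2 * n:]    # later receives whichever version wins
    return group_a, group_b, remainder

a, b, rest = split_for_ab_test(range(1000))
print(len(a), len(b), len(rest))  # 100 100 800
```

Shuffling before slicing matters: subscriber lists are often ordered by sign-up date, and taking the first 10% unshuffled would bias the test towards your oldest subscribers.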
Organise and Consider Your Findings
Once you are confident with the results your testing has obtained, you want to organise your findings into a spreadsheet.
This will allow you to keep track of what you’ve tested, identify patterns and make it easier for you to apply your findings to other aspects of your marketing.
For instance, if you find that using images of people on your landing page, as opposed to images of products, improved click-through rates, you can apply this to your adverts or homepage.
It won’t always be obvious, but try to consider why you got the results you did – particularly if they are surprising. For example, even though you placed a CTA at the top of the page for great visibility, it may appear before the user has any context, so they won’t know what they’re being asked to sign up for.
Got any surprising results from your own A/B tests? Let us know! Comment below…
Sources
Pardot: The ABC’s of AB Testing, by Jenna Hanington
Marketo: Better Results Through Testing
Marketing Automation Software Guide: Why A/B Testing Is Essential To Your Marketing, by Nathan Yerian
Lander: What Is AB Testing?
Hubspot: 4 AB Testing Tips for Beginners, by Jeremy Ellens