What is A/B testing?

A/B testing is a method of experimentation used by marketers to compare two versions of a variable, such as a web page, email, or digital ad, and determine which version performs better and, more importantly, why.

By running A/B tests, marketers can evaluate the impact of changes made to content and audience targeting through metrics such as reach, engagement, click-through rate, and conversion.

The two versions are referred to as ‘A’ (the control version) and ‘B’ (the experimental version). Through this method of experimentation, marketers can continually improve results by determining which version performs better and why, and applying those learnings to future campaigns.

What is an example of A/B testing?

A/B testing typically begins with a hypothesis that marketers want to prove. For example, a sportswear brand might decide it wants to test the hypothesis that using models in product photography will drive higher sales.

To A/B test this hypothesis, the brand builds two versions of the same landing page: one with product photography featuring models and one without.

The brand then runs a split test by sending half of its website visitors to one version and the other half to the other. By tracking the number of conversions (in this case, sales) it receives from each landing page variant, the brand can either prove or disprove the original hypothesis that product photography featuring models sells more clothes.
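The 50/50 split described above can be sketched in a few lines of Python. Everything here is hypothetical (the visitor IDs, the `assign_variant` helper, the in-memory tallies) and stands in for whatever testing tool the brand actually uses; hashing the visitor ID keeps each visitor in the same variant on repeat visits:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into 'A' or 'B' (50/50 split)."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Running tallies per variant (a real tool would persist these).
visitors = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

def record_visit(visitor_id: str, converted: bool) -> str:
    """Log one visit and whether it resulted in a sale; return the variant."""
    variant = assign_variant(visitor_id)
    visitors[variant] += 1
    if converted:
        conversions[variant] += 1
    return variant
```

Because assignment is derived from the visitor ID rather than a coin flip per page load, a returning visitor always sees the same landing page, which keeps the two samples clean.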

The data from this A/B test can then be used to optimize the user experience across their site even further, thus creating a virtuous cycle of A/B testing, optimization, and continuous improvement.

What are the most common metrics marketers A/B test?

Time on page: This metric measures how long users stay on a page before leaving. A/B testing can track whether one version keeps users engaged longer than the other.

Bounce rate: This metric measures how many visitors leave a website after viewing only one page. A/B testing can help determine which variation leads to fewer bounces.

Click-through rate (CTR): This is the percentage of users who click on a banner ad or link. A/B testing can be used to test which version of an advertisement gets more users to click.

Conversion rate: This is the percentage of users who take a desired action, such as signing up for a newsletter or buying an item. A/B testing can be used to measure which version produces a higher conversion rate.

What are common mistakes marketers make when A/B testing?

  1. Not setting clear goals

    Without clearly defined goals and criteria for success, it’s difficult to know whether an A/B test has been successful. For example, a company may want to increase the number of newsletter signups on their landing page, but because they haven’t clearly defined what “success” looks like in terms of conversion rate or other metrics, they can’t properly compare or evaluate results.
  2. Not giving tests enough time

Ending a test too early is another common pitfall. A test needs to run long enough to collect a statistically significant sample of traffic; stopping as soon as one variant pulls ahead often means acting on random noise rather than a real difference. Results can also vary by day of week or season, so tests should generally run for at least one full business cycle before any conclusions are drawn.
  3. Running conflicting tests

    Marketers should avoid testing more than one variable at a time — otherwise, the results could be misleading since the individual changes can’t be properly evaluated. An example would be testing ad creative that has different imagery and different ad copy. In this example, it would be impossible to say whether it was the image or copy that performed best since both changed.
  4. Manual processes and disparate data

    Any A/B testing that relies on the manual collation of data from various sources is prone to human error and inconsistency. Automated marketing platforms and analytics tools (such as Google Analytics) are a far better option when it comes to both running A/B tests and analyzing the resulting data.
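On the question of how long is “enough” (mistake 2 above), a standard sample-size approximation gives a concrete answer. This sketch uses only the Python standard library; the defaults for significance level and power are common statistical conventions, not figures from this article:

```python
from statistics import NormalDist
import math

def min_sample_size(baseline_rate: float, min_detectable_lift: float,
                    alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate visitors needed per variant for a two-sided test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 at alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 at 80% power
    p = baseline_rate
    delta = p * min_detectable_lift                 # absolute difference to detect
    # Pooled-variance approximation for two equal-sized groups.
    n = 2 * (z_alpha + z_beta) ** 2 * p * (1 - p) / delta ** 2
    return math.ceil(n)

# e.g. a 5% baseline conversion rate, aiming to detect a 20% relative lift:
# min_sample_size(0.05, 0.20) -> roughly 7,500 visitors per variant
```

The takeaway: small lifts on low baseline rates require surprisingly large samples, which is why tests stopped after a day or two so often mislead.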

A/B testing with Emarsys

Emarsys works with leading brands like PUMA, Pizza Hut, and Nike to increase acquisition, purchase frequency, average order value, and retention. The platform helps marketing teams power effective A/B testing at scale, and unifies customer, sales, and product data to enable personalized omnichannel engagement.

Deliver the predictable, profitable outcomes that your business demands.

Explore the Platform