
4 email A/B testing best practices

A/B testing can help marketers improve email campaigns one variable at a time. Best practices include designing the right test, starting with the sender's name and using AI tools.

Experienced email marketers know how to engage their audiences, but A/B testing can optimize their efforts.

Each email a marketer sends contains various elements, such as a subject line and a preheader, that organizations can fine-tune to improve open and click-through rates. Presenters at HubSpot's Inbound 2023 conference in Boston shared best practices for email A/B testing -- a testing method that lets marketers see how subsets of their audience react to different versions of emails.

A/B testing can help marketers improve engagement and increase revenue. However, marketers must know which variables to isolate and how often to run tests.

What is A/B testing?

A/B testing, also known as split testing, describes experiments marketers can run to compare the performance of one email variation against another. These tests, which most marketing platforms can automate, send two different email variations to a subset of an organization's contact list. After a set period, the system evaluates which version performed better and sends the winning version to the remaining contacts on the list.
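To make that flow concrete, the following Python sketch simulates a basic A/B test: it splits a contact list so each variant reaches 10% of the audience, compares open rates and rolls out the winner. The contact list, open rates and helper function are hypothetical stand-ins for what a real marketing platform automates.

```python
import random

def simulate_opens(group, open_rate):
    # Stand-in for a real send: each contact "opens" at the given rate.
    return sum(random.random() < open_rate for _ in group)

def run_ab_test(contacts, rate_a, rate_b, test_fraction=0.2):
    # Split a test subset in half: 10% of contacts get A, 10% get B.
    random.shuffle(contacts)
    test_size = int(len(contacts) * test_fraction)
    group_a = contacts[: test_size // 2]
    group_b = contacts[test_size // 2 : test_size]
    remainder = contacts[test_size:]

    open_rate_a = simulate_opens(group_a, rate_a) / len(group_a)
    open_rate_b = simulate_opens(group_b, rate_b) / len(group_b)

    # Roll the better-performing variant out to everyone else.
    winner = "A" if open_rate_a >= open_rate_b else "B"
    print(f"A: {open_rate_a:.1%}, B: {open_rate_b:.1%} -> version {winner} "
          f"goes to the remaining {len(remainder)} contacts")
    return winner

run_ab_test(contacts=list(range(1_000)), rate_a=0.20, rate_b=0.25)
```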

Common variables that email marketers test include the following:

  • Sender name.
  • Subject line.
  • Preheader.
  • Email copy.
  • Image size.
  • Call to action.
  • Link quantity.
  • Time of sending.
  • Email length.

Email testing best practices to adopt

Marketers can follow email testing best practices to identify and strengthen weak points in their campaigns.

1. Design the right test

A/B testing offers empirical data that can improve marketing efforts, but each test still requires a human to set the parameters. The effectiveness of an A/B test depends on the test's overall design, said Chris Eichelsheim, head of inbound marketing at Dtch. Digitals, a marketing agency based in the Netherlands.

Chart: How A/B testing compares two versions of one digital asset, such as an email, to find the more effective one.

First, marketers must decide on a variable to test, which depends on their specific needs. For example, an email might have a high open rate but a low click-through rate (CTR). In this case, a marketer might A/B test the copy of the email body, as opposed to the subject line, because the problem lies in the CTR.

Then, marketers choose an appropriate sample size for their test. Larger sample sizes typically yield more reliable results than smaller ones.

Ideally, marketers should choose a sample size large enough to reach statistical significance at a 95% confidence level. Statistical significance indicates how likely it is that a test's results reflect a real difference rather than random chance; a 95% confidence level means there is only a 5% probability the observed difference arose by chance. Online calculators can help marketers find the sample size needed to reach that level of certainty.
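For marketers who prefer to compute this directly, the following Python sketch applies the standard two-proportion z-test formula that such calculators typically use. The baseline open rate, expected lift and 80% power are illustrative assumptions, not figures from the conference sessions.

```python
import math
from scipy.stats import norm

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    # Contacts needed per variant to detect a change in open rate
    # from p1 to p2 at a 95% confidence level with 80% power.
    z_alpha = norm.ppf(1 - alpha / 2)  # 1.96 for 95% confidence
    z_beta = norm.ppf(power)           # 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a lift from a 20% to a 25% open rate takes about
# 1,091 contacts per variant:
print(sample_size_per_variant(0.20, 0.25))
```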

For example, marketers with over 1,000 contacts might test on about 20% of their audience, so 10% receive version A and 10% receive version B. After a period, the marketer identifies the winner and sends that email to the remaining contacts. This ratio lets marketers test enough people to generate statistical significance at high confidence levels and lets the majority of contacts receive the more effective email.

However, organizations with smaller contact lists should test a larger percentage if they want to achieve high levels of statistical significance. For example, if marketers with a contact list of 200 test a subject line on 20% of their audience, only 20 people would receive each version. Four people might open version A, whereas six might open version B, but marketers cannot confidently say version B is more effective based on such a small sample size.
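Running those hypothetical counts through a significance check illustrates the point. A minimal sketch with Fisher's exact test in Python -- a test suited to small samples, though not one named in the sessions -- returns a p-value far above the 0.05 threshold that a 95% confidence level requires:

```python
from scipy.stats import fisher_exact

# Version A: 4 opens out of 20; version B: 6 opens out of 20.
table = [[4, 16],   # [opens, non-opens] for version A
         [6, 14]]   # [opens, non-opens] for version B
_, p_value = fisher_exact(table)
print(f"p-value: {p_value:.2f}")  # about 0.72 -- nowhere near significance
```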


2. Start with the sender name, subject line and preheader

When marketers launch new email campaigns, they can begin A/B testing with the sender name, subject line and preheader before they test the email copy itself. Recipients see only these three elements before they open an email, so marketers should prioritize them.

"If people don't open your email, who cares what's in your email? No one's seeing all this beautiful artistry you did. ... We need to spend time and energy on the things that get the email open," said Jay Schwedelson, CEO of Outcome Media, a marketing services company, in a session called "Debate: Get the Open! vs. Get the Response!"

Marketers can run various tests to improve open rates. For example, they can test different sender names, such as "Company Name" vs. "Joe from Company Name." They can also test numbers that end in zero or five in the subject line versus other numbers. For example, "7 tips for retail leaders" might generate a higher open rate than "10 tips for retail leaders." Additionally, organizations can experiment with emojis in the preheader, Schwedelson said.

An email's sender name, subject line and preheader have a huge effect on a campaign's success, so marketers should start there. After they test these elements, they can move on to other aspects of the email, such as the header, call to action or overall copy structure.

3. Supplement A/B testing with AI tools

Marketers shouldn't limit themselves to A/B testing alone; instead, they can combine different types of email tests to optimize campaigns. For example, marketers can find free tools online to evaluate their emails and spark ideas.

"A lot of free services ... will scan your email and give you pointers on [how] to ... make sure that you won't enter the spam trap," Eichelsheim said.

Organizations can also use free tools, such as the subject line grader SubjectLine.com, and generative AI tools, such as ChatGPT, to grade their marketing content, generate subject lines and offer tips. For example, a marketer can paste email copy into a generative AI tool and ask, "How can I make this email sound more exciting?" or "How can I add a sense of urgency to this email?"
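Marketers who want to script such prompts can do the same through an API. The sketch below uses OpenAI's Python client as one example; the model name and email copy are assumptions, and the ChatGPT web interface works just as well without any code.

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

email_copy = "Our spring sale starts Monday. Save on all annual plans."
response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical choice; any chat model works
    messages=[{
        "role": "user",
        "content": f"How can I add a sense of urgency to this email?\n\n{email_copy}",
    }],
)
print(response.choices[0].message.content)
```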

AI tools can help marketers avoid spam traps and improve open rates and CTRs. Additionally, they can help users brainstorm ideas for email copy that they can run through A/B tests.

4. Test everything

Innovation in marketing can begin with an idea that sounds a bit different or unconventional. For example, a marketer might want to send a promotional email with nothing in the subject line but an emoji.

A marketing supervisor's knee-jerk reaction to this idea might be that it won't work. However, if marketers find themselves doubting a new approach or idea, they should still test it, Schwedelson said in a session called "ENCORE: New Email Marketing Test Ideas and Pitfalls to Avoid."

A/B testing helps marketers use empirical evidence to find more effective marketing strategies. To optimize campaigns, marketers should follow email testing best practices and test every idea they can -- both to improve results and to learn more about their audiences.
