Studies have shown time and again that A/B testing your emails, when done properly, yields actionable results. Research and surveys also show that many companies still do not employ A/B testing as part of their email marketing strategy. In this article we give you a leg-up to start A/B testing your emails.

A/B test on subject

The easiest form of A/B testing is trying to find which subject lines lead to the best open rates. Subject length, phrasing, personalization, use of emojis and capitalization can all affect how likely recipients are to open your email. Personalizing a subject is a tried-and-true method, but the tone of the subject, the vocabulary used and the way it is phrased (a statement versus a question, for example) are important factors too.

A/B test on From name and address

Another variable to test is the From information: the sender name and address. Some cases have shown that using a person's name instead of a company name and address can increase open rates, often paired with a more informal tone in the subject line. Such evidence is anecdotal and likely brand- and case-dependent, so it's worth testing what works best for your company.

A/B test on content

You can also test different content versions. In this case you probably want to use click rate (click-through rate, CTR, or click-to-open rate, CTO) as the deciding metric, unless you're testing whether different pre-headers have an impact on open rates. You might even be interested only in clicks on specific links in the content. Variations in the use of images and color palette, the length, tone and message of the copy, and the way links or buttons are displayed can all have a big impact on how likely recipients are to click through to your website.

Tips and tricks

Here are a couple of simple guidelines to get better results out of your A/B testing:

  • Think about what results you expect to see from your test and use the test to verify that hypothesis.
  • Focus only on one aspect at a time: an isolated testing variable can lead to more definitive conclusions.
  • Dare to make high impact changes to your different test versions.
  • Use A/B testing for all different types of email: don’t only focus on marketing or sales related content while neglecting transactional messages.
  • Use a control version to make sure your results are not an anomaly.
  • Try to reproduce the outcome of a test with a replication study.
  • Use a large enough sample size and make sure your results are statistically significant before drawing conclusions. You can use a significance calculator to check whether your numbers yield a meaningful result.
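
The significance check in the last tip can be sketched as a standard two-proportion z-test, here comparing open rates between two versions. This is a minimal illustration of what a significance calculator does under the hood; the function name and the example numbers are hypothetical, not taken from any specific tool:

```python
from math import sqrt, erf

def significance(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference in open rates significant?"""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled proportion under the null hypothesis (no real difference)
    p = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: version A opened by 180 of 1000, version B by 230 of 1000
z, p = significance(opens_a=180, sent_a=1000, opens_b=230, sent_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below 0.05 would let you call version B the winner at the conventional 95% confidence level; with smaller samples the same percentage difference might not reach significance.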

A/B testing in Tripolis Dialogue

Setting up an A/B test in Dialogue is simple. You can create one directly from the Content page by clicking the A/B test link in the action bar. We allow up to four different versions (A, B, C and D), and to determine the winning version you can choose between open rate, CTR and CTO (optionally for specific links). The different versions are sent to random selections from the target audience. You can send the test versions to all contacts in the group, or A/B test on part of the target audience and send the winning version to the remaining contacts.
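
The random split described above can be sketched as follows. This is a minimal illustration of the general technique, not Dialogue's actual implementation; the function and parameter names are hypothetical:

```python
import random

def split_audience(contacts, test_fraction=0.2, versions=("A", "B")):
    """Randomly assign a fraction of the audience to the test versions;
    the remainder is held back to receive the winning version later."""
    shuffled = contacts[:]
    random.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    test_group = shuffled[:test_size]
    holdout = shuffled[test_size:]
    # Round-robin assignment keeps the version groups equally sized
    assignment = {v: [] for v in versions}
    for i, contact in enumerate(test_group):
        assignment[versions[i % len(versions)]].append(contact)
    return assignment, holdout
```

For example, with 1,000 contacts and a 20% test fraction, versions A and B each go to a random 100 contacts, and the remaining 800 receive whichever version wins on the chosen metric.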