A/B Testing That Lifts ROI: Not All Testing is Created Equal

by JBest

Testing is so prevalent in the industry that it’s become a running joke. When someone asks what the best, highest-performing, most impactful tactic or strategy is, we say, “Let’s test it!” Not sure how to show your team that your brilliant idea is the right way to go? Better test it.

But not all tests are created equal. You can test subject lines until you're blue in the face (or would red work better for conversions? wink), but you may not actually be learning anything you can apply to the next campaign. You may think you have a clear winner… but your sample size is too small. Or version A got a better open rate while version B got a better click-through rate… How do you decide which one won?

To test the right way in email marketing – meaning, to be sure we’ll be able to learn from and scale our winner – there are three fundamental prerequisites for a test that can actually move the needle for your company:

  1. Define the metrics that determine a winner. Opens may be fine for some tests, but clicks on a CTA or online conversions may be more what you’re hoping to get out of the test: it depends on the goals of your campaign. Plus, you don’t want to be at the end of an A/B subject line test, staring at conflicting metrics, unsure what you’ve learned. For example, if version A gets a better open rate but version B gets a better click-through rate, which will you deem the winner?

    Define your campaign goals and metrics up front, because you’ll need them for #2.
  2. State your hypothesis. Yes, actually write it out. I recently shared this in a lab, and the marketers pointed out, “Well, we don’t KNOW which will work. That’s why we’re testing it!” True, but you may have some preliminary data or an inkling, and here’s why it’s important to state it ahead of time: if you just test two things against each other, you may not actually learn much. Take a stand and state which version will create lift (based on your metrics above) so that you can set up your test to prove or disprove that hypothesis.

    For example, let’s say your hypothesis is that sending on Thursday or Friday will result in better online sales, or that shorter subject lines work better than longer ones.
    [Image: Thrivent A/B subject line split test showing conflicting results]

    Then, write a test that will really put that hypothesis, well… to the test! In fact, try to prove your hypothesis WRONG so that if it wins, you know it wasn’t just your confirmation bias.

  3. Determine statistical significance. In the example above, version B got more clicks on the primary CTA, but we’re only 53% certain that was due to our test variation. Don’t make the mistake of printing that “I was right!” banner for your office if the difference between versions comes down to a handful of conversions.

    Avoiding this miss starts with not making your test segments too small. When you’re measuring email metrics like opens or clicks (rather than conversions), a segment size of at least 10,000 is typically a good minimum for testing. For example, you could perform a 10/10/80 test with a 100K list: send version A to 10K, send version B to 10K, and after a designated amount of time, determine a winner and send the remaining 80K the winning version. If your target metric is conversions, you may need larger segments and a bit more time to let the results “settle in” for a real finding.
    If you have fewer contacts than that, you can still test! You might aim for a 50/50 split instead and then check your statistical significance. Pro Tip: You don’t have to remember the statistical significance equation from Stat class in college. Mercifully, there’s a free online Statistical Significance Calculator here. (If you’d rather script the math yourself, see the sketch after this list.)

    [Image: Thrivent A/B subject line split test where the click-rate difference is not statistically significant]
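If you’d rather script the math than plug numbers into a calculator, here’s a minimal Python sketch of the ideas in step 3, using only the standard library: the 10/10/80 split, a two-proportion z-test (the same kind of math most online significance calculators run), and a rough minimum-cell-size check. The function names and every number below are hypothetical, invented for illustration – they are not from the Thrivent tests shown above.

```python
import math
import random
from statistics import NormalDist


def split_10_10_80(contacts, seed=42):
    """Shuffle a contact list and split it into test cell A (10%),
    test cell B (10%), and the 80% holdout that will get the winner."""
    shuffled = contacts[:]
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible
    n = len(shuffled) // 10
    return shuffled[:n], shuffled[n:2 * n], shuffled[2 * n:]


def ab_confidence(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test on click rates. Returns (winner, confidence
    that the observed difference is real rather than noise)."""
    rate_a, rate_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value, normal approximation
    return ("B" if rate_b > rate_a else "A"), 1 - p_two_sided


def min_cell_size(base_rate, expected_rate, alpha=0.05, power=0.8):
    """Rough sends needed per test cell to detect base_rate -> expected_rate."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = base_rate * (1 - base_rate) + expected_rate * (1 - expected_rate)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (base_rate - expected_rate) ** 2)


# Hypothetical walk-through on a 100K list: 10K / 10K / 80K.
contacts = [f"contact_{i}" for i in range(100_000)]
cell_a, cell_b, holdout = split_10_10_80(contacts)

# Hypothetical results: version A gets 320 clicks, version B gets 350, on 10K sends each.
winner, confidence = ab_confidence(320, len(cell_a), 350, len(cell_b))
print(f"Version {winner} leads with {confidence:.0%} confidence")
if confidence < 0.95:  # pick your threshold up front, in step 1
    print("Not significant yet -- hold off on the 'I was right!' banner.")

# Why ~10K per cell? Detecting a 3.0% -> 3.6% click-rate lift takes roughly:
print(min_cell_size(0.030, 0.036), "sends per version")  # about 14K
```

Swap in your own send counts and click totals; the 95% threshold is a common convention, but whatever bar you choose, set it before the send, per step 1.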

Want a shortcut to learning from A/B testing? You can subscribe to organizations like BounceX (now owners of what used to be Behave.org, formerly Which Test Won) or MarketingSherpa/MECLABS to see test results in case-study format and get inspired for your next test.

Looking for more data that drives email marketing results? Check out my last OI article: Getting Credit Where Due: Email Attribution Ideas for Every Business.

    Publisher: OnlyInfluencers.com
    Copyright 2018, Only Influencers, LLC