The 5 Most Frequently Asked Questions About A/B Testing in Marketing

August 20, 2014


By Dolly Howard

Wouldn’t it be great if we always knew how people would think and react to the cues and stimuli we present? Then we could create websites and offers with 100 percent conversion rates. How wonderful would that be?

Alas, in the real world this isn’t possible, which is why it’s so important that we use A/B testing to find the best way to get our message across to our audience.

A/B testing, otherwise known as split testing, is a method of comparing two versions of a variable to determine which one produces the better response from our visitors. By doing this, we learn more about the behavior of our target audience so we can improve our messaging and bring in more traffic and, in turn, more conversions. According to HubSpot, testing landing pages can increase lead generation by 30-40 percent for B2B websites and 20-25 percent for e-commerce sites.

Knowing how crucial A/B testing is for increasing leads, it’s important that you begin experimenting as soon as possible. Here are five of the most frequently asked questions about A/B testing to help you get started:

1. Why do A/B testing?

If you’ve read any books about startups, you’ve probably heard the principle of failing your way to success, or learning through failure. This rings true for just about everything in life, and marketing is no exception.

Marketing is always changing, and the best way to learn what works and what doesn’t is through trial and error. But you don’t want to rely on guesswork alone, and this is where A/B testing comes into play. Remember: Your target audience is a collection of diverse, real individuals; they will not always act the way we expect.

2. What should you test?

Start simple. Figure out the page or email that you want to test, then determine which element (also called the variable) you want to test. To choose the best pages to test, ask yourself these two questions:

  1. Which pages have the highest traffic? 
  2. What is your primary call to action (CTA)?

Landing pages are great for A/B testing because they are relatively self-contained and shouldn’t affect the rest of your site’s pages. When deciding which variable to test, consider which would affect conversion the most. For example, CTA buttons and copy are typically the first variables to be tested, but you can also experiment with headers, email subject lines, images, and so on.

3. How long should the experiment run before it's considered successful?

First of all, make sure you have a clear goal for the test; if there is no clear winner, your test is inconclusive and you should let it continue to run. That said, there are a few numbers you should consider when deciding whether to end the test and declare a winner.

Statistical Significance

The first number to consider is the statistical significance, which should be 95 percent or higher. A statistical significance of 99 percent means there is only a 1 percent probability that the observed difference is due to random chance. In essence, it is very unlikely that the results happened by chance; they were most likely caused by the specific change you introduced.
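If you want to sanity-check this number yourself rather than take your testing tool's dashboard at face value, here is a minimal sketch of how the math works, using Python and SciPy (a two-proportion z-test; the visitor and conversion counts are hypothetical):

```python
from math import sqrt
from scipy.stats import norm

def ab_significance(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: returns the confidence level (in percent)
    that the difference between A and B is not due to chance."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))  # two-tailed test
    return (1 - p_value) * 100

# Hypothetical numbers: 5,000 visitors per variation
print(ab_significance(275, 5000, 350, 5000))  # ~99.8 -- well above the 95 percent bar
```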

Standard Deviation 

Another number to consider is the standard deviation of the conversion rate. The standard deviation measures how much the results vary around the average. Because the conversion rate is what you use to measure the success of an A/B test, you want to make sure that the conversion rate ranges (the conversion rate plus or minus the standard deviation, aka the conversion range) don't overlap. There should be a clear distinction between the conversion ranges of the two pages tested.

For example, if page variation A has a standard deviation of 1 percent and a conversion rate of 5.5 percent, its conversion range is 4.5 percent to 6.5 percent. If page variation B has a standard deviation of 2 percent and a conversion rate of 7 percent, its conversion range is 5 percent to 9 percent. As you can see, the conversion ranges of the two pages overlap. When this happens, let the test continue until there is a clear distinction between the two conversion ranges and, therefore, a clear winner.
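To make the overlap check concrete, here's a short Python sketch that builds each page's conversion range and tests for overlap, using the hypothetical figures from the example above:

```python
def conversion_range(rate, std_dev):
    """Conversion range: the conversion rate plus or minus one standard deviation."""
    return (rate - std_dev, rate + std_dev)

def ranges_overlap(a, b):
    """True if the two ranges share any values, i.e. there's no clear winner yet."""
    return a[0] <= b[1] and b[0] <= a[1]

# Figures from the example above, all in percent
range_a = conversion_range(5.5, 1.0)  # (4.5, 6.5)
range_b = conversion_range(7.0, 2.0)  # (5.0, 9.0)

if ranges_overlap(range_a, range_b):
    print("Ranges overlap -- keep the test running.")
else:
    print("No overlap -- you have a clear winner.")
```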

Sample Size

The third number you must consider before pulling the plug on your test is the sample size. This is the number of people who took part in your experiment, and for statistical significance it needs to be large. HubSpot recommends that email A/B tests include at least 1,000 contacts, but you can also use a sample size calculator to determine whether your sample is large enough.
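If you'd rather estimate the required sample size yourself instead of reaching for an online calculator, here's a rough Python/SciPy sketch of the standard two-proportion formula at 95 percent significance and 80 percent power; the baseline rate and minimum lift are hypothetical inputs, and your testing tool's calculator may use slightly different assumptions:

```python
from math import ceil
from scipy.stats import norm

def sample_size_per_variation(baseline_rate, min_detectable_lift,
                              significance=0.95, power=0.80):
    """Visitors needed per variation to detect the given absolute lift
    in conversion rate at the chosen significance and power."""
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_lift
    z_alpha = norm.ppf(1 - (1 - significance) / 2)  # two-tailed
    z_beta = norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (min_detectable_lift ** 2)
    return ceil(n)

# Hypothetical: 5.5 percent baseline rate, hoping to detect a 1.5-point lift
print(sample_size_per_variation(0.055, 0.015))  # roughly 4,100 visitors per variation
```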

4. How often should we run A/B tests?

This question doesn’t have a numerical answer, because it depends on whether you have a good reason for testing. Every test you run should have a clear goal in mind.

5. What is multivariate testing? How is it different from A/B testing?

Whereas A/B testing lets you compare two versions of a single variable and choose the better of the two, multivariate testing lets you test many variables (and combinations of them) simultaneously. However, reaching statistical significance with multivariate testing requires an extremely high volume of traffic (i.e., the kind of traffic that Google, YouTube, and Facebook get).

What now? 

A/B testing doesn't have to be difficult, and there are many software platforms with this feature available, including HubSpot. If you have questions or thoughts on our take on A/B testing, please let us know by tweeting us at @smartbugmedia!

 

 

Topics: Marketing Strategy, User Experience, Digital Strategy, Conversion Rate Optimization