Thursday, September 29, 2016

A/B Testing: Here’s How You Could Be Doing It Wrong

David Ogilvy, one of the founding fathers of modern advertising, once said "Never stop testing, and your advertising will never stop improving." Back in 2008, Bill Gates remarked that "we should use the A/B testing methodology a lot more than we do."

[caption id="attachment_140440" align="aligncenter" width="1024"]Never stop testing, and your advertising will never stop improving. David Ogilvy, for all of his slim cut full tweed suits, Oxford education and relentless tobacco pipe smoking sure knew what he was talking about as far back as 50 years ago when he famously said "Never stop testing, and your advertising will never stop improving.”[/caption]

In the modern web world, thanks to a variety of user-friendly software tools, split testing and conversion optimization are becoming more commonplace and accessible. With over 330,000 active websites using the A/B split-testing tool Optimizely, and over 78,000 active websites using Visual Website Optimizer, it's no secret that split testing is gaining in popularity. Among the leading online retailers and businesses, nearly 12,000 run one or the other of these popular services.

[caption id="attachment_140443" align="aligncenter" width="631"]A/B Testing Usage - Statistics for websites using A/B Testing technologies - BuiltWith.com The hard numbers on A/B Testing Software Usage from BuiltWith as at September 2016. Optimizely takes the lead both in top traffic websites and the entire web in general (including smaller sites/SMEs).[/caption]

Despite the proliferation of A/B testing software so easy to use that your grandmother could launch her first A/B test, UserTesting.com found that 90% of their internal A/B tests failed. A similar study by VWO found that only 1 in every 7 A/B tests (14%) produced a statistically significant winning result. In other words, typically only around one test in seven to ten will have any significant impact on conversion rates, and even then, in most cases, the average improvement on the metrics that matter most (online purchases, revenue, leads) is low.
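
(If you want to sanity-check what "statistically significant winning result" means in practice, the usual approach is a two-proportion test. Below is a minimal sketch in Python using statsmodels; the visitor and conversion counts are made-up numbers for illustration, not figures from the VWO or UserTesting studies.)

    # A minimal sketch of a two-proportion z-test for an A/B test result.
    # The visitor and conversion counts are illustrative assumptions only.
    from statsmodels.stats.proportion import proportions_ztest

    conversions = [300, 345]       # control, variation
    visitors = [10000, 10000]      # visitors exposed to each version

    z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

    if p_value < 0.05:
        print("Statistically significant difference at the 95% level.")
    else:
        print("No statistically significant winner (like most tests).")

Even a difference that looks meaningful in the raw numbers can fail to clear that bar once sample size is taken into account.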

Is the push toward embracing A/B testing just a bunch of hype with no substance? We don't think so. Chances are you're just doing it wrong. Here's what we've observed over the last 5 years of doing A/B testing for more global brands (and smaller companies) than you can shake a (very scientific) stick at.

Rote Testing

In school, rote learning was when your teachers decided it would be easier to simply have you memorize the answers or mnemonics so you could pass the test.

The A/B testing equivalent of this is skimming A/B testing case studies and blogs and mindlessly re-running those same tests on your own website, with its own unique audience of traffic and visitors.

If you search around briefly for A/B testing case studies and blogs, you'll often come across overly simplistic ideas (stated explicitly or merely implied):

  • 'Your call to action has to be ABOVE the fold!'
  • 'Having a video converts better!'
  • 'Long copy beats short copy!'
  • 'Short copy beats long copy!'
  • 'Single-page check-out beats multi-step check-out!'
  • 'Multi-step check-out outperforms single-page check-out!'
  • 'Fewer form fields are ALWAYS better than more form fields!'

For each of the above, I have seen winning A/B tests that both 'prove' and 'disprove' it (across various market segments and audiences).

Here's an example from WhichTestWon in which Aussie, a leading Australian home loan and mortgage broker brand, ran an A/B test where the variation included a video (in addition to some other changes).

[Image: Aussie Home Loans A/B test]

I've often heard people talk in very general terms about web video, as if it were a certainty that adding a video will always make your page perform better. Contrary to that, in this test the version without the video actually generated 64% more mortgage application leads.

If you're thinking about A/B testing in this simplistic, 'rote' way, it's very unlikely that you'll ever see a test produce a 100% or greater improvement in conversion rate, or that you'll achieve a better-than-average percentage of winning tests.

Dogmatic ideas like these are a sign that you're most likely approaching A/B testing in a way that won't produce optimal results.

Human beings, taken in reasonable sample sizes (after all, that's who we're attempting to influence with our experiments), are too complicated to be reliably persuaded by such simple changes. We're also all quite different, with unique dreams, hopes, aspirations, goals, needs, wants, personalities, problems and psychographic profiles.
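
(To get a feel for what "reasonable sample sizes" actually means, here's a rough back-of-the-envelope power calculation, again sketched in Python with statsmodels. The 3% baseline conversion rate and 10% relative lift are assumptions picked purely for illustration.)

    # Rough sketch: visitors needed per variation to reliably detect a
    # modest lift, with 80% power at the 95% confidence level.
    # The baseline rate and relative lift below are assumed values.
    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    baseline = 0.03                    # assumed 3% baseline conversion rate
    variant = baseline * 1.10          # assumed 10% relative improvement

    effect_size = proportion_effectsize(variant, baseline)
    n_per_group = NormalIndPower().solve_power(
        effect_size=effect_size, alpha=0.05, power=0.8,
        alternative='two-sided')

    print(f"Roughly {round(n_per_group):,} visitors needed per variation")

The exact number isn't the point; the point is that, with assumptions like these, detecting a small, realistic lift takes tens of thousands of visitors per variation, which is one more reason copying someone else's winning test rarely translates to your own audience.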

"Shotgun" Testing

If you review the past test results of many organizations heavily involved and invested in A/B testing, you'll often come across a phenomenon that I like to call 'Shotgun Testing'.

Continue reading A/B Testing: Here’s How You Could Be Doing It Wrong on SitePoint.


by James Spittal via SitePoint
