There's plenty of literature on the merits of A/B testing, but little information on its pitfalls. Most people in ecommerce, tech, and startups would have you believe A/B testing is a must-do activity when, in reality, it's often unnecessary and counterproductive.
At Sticker Mule, we recently stopped actively A/B testing and I thought it'd be helpful to explain why. I should preface this by admitting we ran more than 70 conversion tests before making this decision and had only 3 significant winners. I don't know what a normal ratio of tests to wins looks like, but I suspect our situation is common. This isn't to say we don't believe in constantly working to improve conversion--we've just found different ways, which I detail later.
The 6 hidden costs of A/B testing
1) Opportunity cost
Resources are finite, and time spent on A/B testing cannot be spent elsewhere. How you use your time determines your growth trajectory. A/B testing is a growth tactic, but it's not always the best use of your resources.
2) Lost conversions
Most conversion tests fail and underperforming variations cause you to lose conversions while they run. These lost conversions are an investment you make to improve your long term conversion rate, but don't forget they are a real cost.
3) Performance costs
Related to lost conversions, A/B testing tools make your site slower, and this also reduces your conversion rate--even for the control. The additional conversions you lose because of this performance hit are another, often ignored, cost of A/B testing.
4) Customer and team confusion
A/B tests can confuse your team and customers. Customer service may struggle to properly serve a customer if they don't know which variation the customer is viewing. You can educate customer service about the various tests you're running, but, regardless, you're making their job more difficult.
5) Slower decision making
A/B testing slows down your organization's decision making. Some ideas are obviously good, and adhering to a strict A/B testing process delays putting them live. Time is finite, and the number of improvements you implement per year has a major impact on your growth trajectory.
6) Organizational distraction
Organizations often congratulate themselves for investing heavily in conversion optimization, evidenced by hiring a Head of CRO or paying consultants. The largest cost is not the salaries of those people, but the shift in focus of other team members such as designers, customer support, and developers. The success of a Head of CRO usually comes at the cost of making colleagues less successful in their primary roles.
How we improve conversion without A/B testing
Stepping away from A/B testing does not mean we have abandoned efforts to improve conversion. Rather, we are improving conversion through other methods (yes, A/B testing isn't your only option).
You can monitor your long term conversion trend fairly easily via Google Analytics to gauge whether you're making good decisions, and you can add precision to this method by looking at your conversion rate trend by channel.
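As a rough illustration of channel-level trend monitoring, you can compute conversion rates per channel over time from an analytics export. The CSV columns and values below are assumptions for the sketch, not a Google Analytics schema:

```python
import pandas as pd
from io import StringIO

# Hypothetical analytics export: one row per channel per period.
csv = StringIO("""week,channel,sessions,transactions
2023-01,organic,1000,30
2023-01,paid,500,10
2023-02,organic,1100,36
2023-02,paid,450,11
""")

df = pd.read_csv(csv)
df["conversion_rate"] = df["transactions"] / df["sessions"]

# Pivot so each channel becomes a column and the trend is easy to read.
trend = df.pivot(index="week", columns="channel", values="conversion_rate")
print(trend)
```

Watching each channel's column over time helps separate a genuine conversion change from a shift in traffic mix.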
With that in mind, we're doubling down on the following to improve our conversion rate:
Usability principles
Adhering to thoroughly researched usability principles generally yields good conversion optimization results.
Design principles
In addition to usability principles, there are basic design principles that tend to improve conversion when followed.
The golden rule
"Do unto others..." is a guiding principle for most of our decision making and it works great with regards to conversion optimization.
Solving problems
Solving actual pain points our customers experience is the primary way we've been able to improve our conversion rate and strengthen our brand.
We haven't abandoned A/B testing entirely
I should admit that we still A/B test some ideas. Rather than adhering to a strict A/B testing process, we let our team A/B test ideas they're curious about.
To minimize the likelihood of testing too much, we turned off our 3rd party A/B testing tool and integrated split testing into our code base. That means, in order to run a test, we need help from development to set it up. This extra effort discourages running careless tests and has the added benefit of eliminating the performance hit that comes with using a 3rd party tool.
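A minimal sketch of what server-side split testing can look like, assuming deterministic bucketing by hashing a user id with the experiment name; the function name and scheme are illustrative, not Sticker Mule's actual implementation:

```python
import hashlib

def variant_for(user_id: str, experiment: str,
                variants=("control", "treatment")) -> str:
    """Deterministically assign a user to a variant.

    Hashing user_id together with the experiment name keeps each
    user's assignment stable and makes assignments independent
    across different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same bucket for a given experiment,
# so no third-party script (and no client-side performance hit) is needed.
print(variant_for("user-42", "checkout-redesign"))
```

Because assignment happens in your own code, either branch can serve an entirely different code path rather than just a cosmetic variation.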
Finally, we're now able to A/B test entirely different versions of our code, which is much more interesting than merely testing basic design changes via a 3rd party tool.
If our approach of improving conversion rate by solving problems rather than using an A/B testing tool appeals to you, you share some of the characteristics we look for in team members.
Feel like you'd fit in well with our team? Apply to join us. We'd love to hear from you.