By Sally Hall
April 12, 2018
There’s one part of winning that most of us forget to talk about: Losing. The only way to really win big is to take big risks, and those risks sometimes teach you what not to do. In the business of A/B testing and optimization, you’ll have to face some losses before you find a winner. Fortunately, if you A/B test smart, those losses don’t have to show up in your revenue stream or bottom line. Instead, we celebrate those losing A/B tests with our Best Fail of the Week.
Fixed One Problem, Revealed Another
A major pet supply retailer noticed low usage of the search tool on its desktop interface. The marketing team suspected that improving the search function and making the search bar more prominent would encourage customers to find the items they wanted — leading to more purchases and higher average order value.
The retailer ran a series of campaigns comparing search usage between the original design and variations that gave the tool a larger icon and more prominent placement. In the winning variations, the brand found that usage of the search tool increased by 36.7%. However, in that same segment, revenue decreased by 3.2%.
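Percentage changes like these are typically reported as relative lift: the variation's metric minus the control's, divided by the control's. The sketch below illustrates the calculation with hypothetical per-visitor numbers chosen to reproduce the figures above; they are not the retailer's actual data.

```python
# Relative lift between an A/B test's control and variation:
# (variant - control) / control, expressed as a fraction.

def relative_lift(control: float, variant: float) -> float:
    """Return the relative change of `variant` over `control`."""
    return (variant - control) / control

# Hypothetical metrics for illustration only.
control_search_rate, variant_search_rate = 0.150, 0.205  # search-tool usage rate
control_revenue, variant_revenue = 4.10, 3.97            # revenue per visitor

print(f"search usage lift: {relative_lift(control_search_rate, variant_search_rate):+.1%}")
print(f"revenue lift: {relative_lift(control_revenue, variant_revenue):+.1%}")
# search usage lift: +36.7%
# revenue lift: -3.2%
```

The point of tracking both metrics in the same segment is exactly what this example shows: a healthy lift in an engagement metric can coexist with a drop in revenue.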
Glass Half Full
This unexpected result sent the team back to the drawing board. It turned out that the website had back-end problems resulting in an ineffective search algorithm. From there, the retailer decided to implement a new search tool, and ran product comparison A/B tests to reveal which tool led to the highest rate of conversions. At the end of this A/B testing period, search usage remained at its highest, and revenue increased as well — more than recovering the 3.2% loss from the initial A/B test.
By running with its failures, this company was able to experiment, innovate, and improve across the features of its digital channels.
To learn more about SiteSpect, visit our website.