Minimize Your Risk by A/B Testing Features Before Launch
By SiteSpect Marketing
September 15, 2015
Imagine this scenario: You help run the website for a successful e-commerce company. You listened to your customers and, based on their feedback, redesigned a key feature on your site, expecting that streamlining the search process would increase conversions. But you were wrong: conversions were flat. What happened?
This scenario happened recently to one of our customers. The company received feedback that its search functionality was difficult to use on mobile devices. To remove this roadblock, the web team redesigned the feature to meet the needs of mobile users, condensing a multi-field search engine into a single search field. After several iterations, the design was ready, and the redesigned search went live on the site.
Eager to understand the impact of the new design, the A/B test team created a test and collected results. The top-level results indicated that conversion rates were flat and the redesign appeared to have no impact. Unsure why, the team dug deeper into the results.
The team focused on how several key segments performed and found that the seemingly flat results were actually three sets of segment results offsetting each other. Specifically, the team compared desktop, tablet, and smartphone visitors. The new design worked well for mobile users and did, in fact, increase conversions for tablets (+12%) and smartphones (+53%). For desktop users, however, the redesign decreased conversions by 4%.
On the surface, a 4% decline in desktop conversions seems negligible. For this particular customer, however, desktop still accounted for 88% of site visits. At that volume, the decline quickly wiped out the gains from mobile traffic, and the overall results would soon have trended negative. The test team needed to react fast.
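The arithmetic behind the flat top-line number is easy to sketch. Here is a minimal Python example that weights each segment's lift by its share of traffic. The 88% desktop share and the per-segment lifts come from the case study above; the even split of the remaining 12% between tablets and smartphones is our assumption (the article doesn't break out the mobile mix), and the sketch also assumes all segments start from a similar baseline conversion rate:

```python
# Blended conversion lift across segments, weighted by traffic share.
# Desktop share (88%) and per-segment lifts are from the case study;
# the tablet/smartphone split of the remaining 12% is an ASSUMPTION.
segments = {
    "desktop":    {"traffic_share": 0.88, "lift": -0.04},
    "tablet":     {"traffic_share": 0.06, "lift": +0.12},
    "smartphone": {"traffic_share": 0.06, "lift": +0.53},
}

# Sum each segment's contribution: share * lift.
blended_lift = sum(s["traffic_share"] * s["lift"] for s in segments.values())
print(f"Blended lift: {blended_lift:+.2%}")  # roughly zero, i.e. "flat"
```

Under these assumptions the blended lift lands near zero: the big mobile wins are diluted by their small traffic share, while the small desktop loss is amplified by its large one. That is exactly how a top-line "no impact" result can hide strong, opposing segment effects.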
The test team worked with their SiteSpect web optimization consultant to determine next steps. Together, they laid out a plan to revert the desktop search to the initial design while continuing to serve the new design to mobile users. Had the customer discounted the early results, the overall negative impact would have been much worse. But they trusted their instincts, kept digging, and discovered the reason behind the results.
Here are a few key takeaways from this real world example that can help you minimize risk:
- Make it a goal to always A/B test new or redesigned features prior to launch. What appears to be, on the surface, a relatively minor change may have a significant impact on your conversions.
- Dig deep when reviewing A/B test results. In this case, the team analyzed the segments targeted to discover an unexpected result.
- React quickly and confidently based on your data. The test team at this customer recognized the problem, devised a plan to fix it, and took steps to re-implement the feature design based on whether traffic originated from a desktop or mobile device.
What other lessons did you learn from this example? Is there anything else you would recommend to other digital optimization leaders?
To learn more about SiteSpect, visit our website.