Are you thinking about A/B testing on your digital channels, or have you been running an A/B testing program but are interested in learning more? You’ve come to the right place. Think of this blog series as a catalogue of optimization A/B tests, where you can learn about what went right, what went wrong, and where you might be able to A/B test your sites for optimal engagement and risk mitigation.
The Client
A large home goods retailer.
This brand wanted to optimize its use of a third-party ratings and reviews tool. The brand suspected that increasing the prominence of “featured” product reviews on product list pages and product detail pages would boost add-to-carts and purchases. The tool, however, provided no reporting on the revenue impact of the ratings and reviews content, which made it difficult to determine the actual ROI of the investment.
The A/B Test
Before A/B testing, the site presented a basic star rating underneath each product image, along with a link that took visitors to further details and reviews. Once on the reviews page, the visitor could choose how to sort reviews and explore further if desired. A/B testing the placement and layout of ratings and reviews showed that a “featured reviews” callout on product detail pages markedly increased engagement and, ultimately, conversions.
Results and Next Steps
The winning variation with the new “featured reviews” callout saw a 20% increase in items added to cart, a 3% increase in average order value, and a 2% increase in revenue per visit. The 90-day profit on the winning variation came to almost $180,000. Based on these results, the brand rolled out the new feature on select product list pages and, once it could accurately measure the revenue impact, easily justified expanding its use of the third-party vendor.
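Before declaring a winner like this, an A/B testing platform checks that the observed lift is statistically meaningful rather than noise. As a rough sketch of that check (the visitor and conversion counts below are hypothetical, not figures from this case study), a two-proportion z-test compares control and variation conversion rates:

```python
import math

def conversion_lift(visitors_a, conv_a, visitors_b, conv_b):
    """Relative lift of variation B's conversion rate over control A's."""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    return (rate_b - rate_a) / rate_a

def two_proportion_z(visitors_a, conv_a, visitors_b, conv_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    # Pool both groups to estimate the standard error under the null
    # hypothesis that the two conversion rates are equal.
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 50,000 visitors per arm, 3.0% vs 3.3% conversion.
lift = conversion_lift(50_000, 1_500, 50_000, 1_650)   # 10% relative lift
z, p = two_proportion_z(50_000, 1_500, 50_000, 1_650)  # p < 0.05 here
```

With these sample sizes the 10% relative lift clears the conventional p < 0.05 bar; with far less traffic, the same lift might not, which is why tests run until enough visitors have been observed.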
To learn more about SiteSpect, visit our website.