Today we shared exciting news about the release of a new SiteSpect feature, Origin Experiments®, which helps product teams and marketers A/B test how their site works, not just how it looks.
As websites become more complex, visitors are increasingly turning to search functionality to find the items or content they need. This is great for marketers, as internal search reporting tells you exactly what visitors have searched for. But optimizing deep functional changes, such as how your on-site search works, requires a different approach.
This brings us to an example of how Origin Experiments can be used to improve the functionality of search results.
Jon is a Digital Marketing Manager at an e-commerce store that sells stationery, calendars, and seasonally themed office supplies. His site carries hundreds of different calendar types, from beautiful glossy photography calendars featuring flowing grass in scenic areas around the world to castles, historic aircraft, office cartoons, and more. The site also carries versions with a wood-grain corkboard border you can tack important reminders to. Whatever type of office calendar you are looking for, it’s likely that Jon’s web storefront sells it.
The problem is… visitors are having trouble finding exactly the calendar they want. Just yesterday, Stacy, a mother of two who loves gardening and growing her own organic vegetables, was looking for a gardening calendar. After finding Jon’s site through a Google search, she arrived and immediately searched for “gardening calanders.” What Stacy did not realize is that she had accidentally misspelled calendar, and the search on Jon’s site currently makes no attempt to guess what the user meant, ask if they meant a similar term, or display results for a near match. Unfortunately, after Stacy saw “no results to display… please try again,” she left the site and moved on to the next company in the Google search results that sold gardening calendars.
As Jon looks into his internal search logs, he realizes that hundreds of visitors are accidentally misspelling words like calendar in many variations, such as calandor and calender. As a result, he wants to A/B test a feature from his on-site search provider that recognizes common misspellings and suggests alternative search terms.
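The story doesn't show how the search provider's feature is implemented, but the idea behind a "did you mean" suggestion can be sketched with fuzzy string matching. A minimal illustration, using Python's standard-library `difflib` and a hypothetical catalog vocabulary (`VOCABULARY` and `suggest_terms` are invented names for this example):

```python
from difflib import get_close_matches

# Hypothetical vocabulary, in practice built from product titles
# and successful past queries in the search logs.
VOCABULARY = ["calendar", "calendars", "gardening", "planner", "corkboard"]

def suggest_terms(query: str, cutoff: float = 0.6) -> str:
    """Replace each query word with its closest vocabulary term, if any.

    get_close_matches ranks candidates by similarity ratio; words with
    no candidate above `cutoff` are left unchanged.
    """
    suggestions = []
    for word in query.lower().split():
        matches = get_close_matches(word, VOCABULARY, n=1, cutoff=cutoff)
        suggestions.append(matches[0] if matches else word)
    return " ".join(suggestions)

print(suggest_terms("gardening calanders"))  # → "gardening calendars"
```

A real provider would likely weight suggestions by query frequency and catalog coverage rather than raw edit similarity, but the principle is the same: map a zero-result query to a near match instead of showing "no results."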
However, instead of simply turning this feature on for all of his visitors, he needs a way to enable it and quantify its effectiveness for his particular audience. After all, search is too important to the user experience, and to Jon’s bottom-line revenue, to make significant functionality changes without A/B testing them first.
Using Origin Experiments, Jon works with Mark from his web operations team to create an initial campaign that A/B tests the new search feature on 5% of traffic. That test proves successful, leading to increased browsing and ultimately higher sales, as visitors find the items they need and view more items rather than exiting. Jon and Mark then run the A/B test again with 10% of traffic, and again with 25% of total traffic. Each set of results points toward the new suggested-search-term feature improving performance, but they have still only exposed it to a quarter of the audience. Before going further, Jon and Mark segment and target an Origin Experiments test at mobile traffic coming from touchscreen devices.
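The source doesn't describe how SiteSpect allocates traffic internally; one common way to implement this kind of 5% → 10% → 25% ramp-up is deterministic hashing of a visitor ID, so a visitor's assignment is stable across visits and raising the percentage only adds visitors, never reshuffles them. A sketch under those assumptions (`in_variant` and the campaign name are hypothetical):

```python
import hashlib

def in_variant(visitor_id: str, campaign: str, percent: int) -> bool:
    """Deterministically bucket a visitor into 0-99 by hashing the
    visitor ID together with the campaign name. A visitor is in the
    variant when their bucket falls below the rollout percentage, so
    growing `percent` strictly expands the exposed cohort."""
    digest = hashlib.sha256(f"{campaign}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# The same visitor gets the same answer on every visit at a given
# percentage, and anyone in at 5% remains in at 10% and 25%.
```

Salting the hash with the campaign name keeps cohorts independent across experiments, so the same visitors aren't always the first 5% exposed to every new test.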
This A/B test is also successful, and Jon and Mark continue to ramp up traffic levels to ensure the new search feature performs well across all of their visitors. They now have enough supporting evidence and statistical confidence that the feature will recapture business they had been losing and increase sales. Meanwhile, the control group gives them real user metrics on the current search experience, without suggestions, which continues to show high page exit rates. This gives Jon the statistical confidence to get buy-in from his team to make the new search feature live for all visitors. Jon uses the QuickChange feature within SiteSpect to push it out to everyone. On the first day the new suggested-terms search feature was live to 100% of traffic, the store generated an additional $5,236.10 in sales!
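"Statistical confidence" in an A/B test like this typically comes from comparing conversion rates between control and variant. As a rough illustration of the underlying arithmetic, not of SiteSpect's own reporting, here is a standard one-sided two-proportion z-test using only the Python standard library (the conversion counts are made up):

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """One-sided z-test: is variant B's conversion rate higher than A's?

    Returns the z statistic and the one-sided p-value computed from
    the pooled standard error and the normal CDF.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # 1 - normal CDF(z)
    return z, p_value

# Illustrative numbers: control converts 200/10,000, variant 260/10,000.
z, p = two_proportion_z(200, 10_000, 260, 10_000)
```

With these invented counts the lift is significant at the conventional 5% level; with smaller samples the same observed lift would not be, which is why Jon and Mark keep ramping traffic before committing.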
In retrospect, Jon realizes that he can now run additional functional A/B tests, ranging from search results to pricing to whole site redesigns, and quantify the impact of ideas that may or may not improve both the quality of the user experience and the quantity of sales. In addition, he can make these changes without requiring Mark to pull in development resources for every A/B testing project, meaning he can implement and run A/B tests more quickly.
This story highlights just one use case of how Origin Experiments opens up A/B testing possibilities that were not previously available.
To learn more about SiteSpect, visit our website.