Are you thinking about testing on your digital channels, or have you been running an A/B testing program and want to learn more? You’ve come to the right place. Think of this blog series as a catalogue of optimization A/B tests, where you can learn what went right, what went wrong, and where you might test your own sites to improve engagement and mitigate risk.
The Customer

A private, nonprofit, accredited educational institution with online and on-campus degree programs.
The Challenge

The university sees a correlation between form submissions on its desktop and mobile websites and its overall revenue. The optimization team had just run an A/B test on the desktop site showing that specialized, hyper-relevant images on form submission pages led to higher conversions. Because of the restricted screen space and the amount of text on the form, the mobile site did not include any photographs on its form submission pages. But the team suspected that if the right images drew more conversions on the desktop site, they would do the same on mobile.
The A/B Test
The team used SiteSpect to run an A/B test with a 50/50 traffic split. One variation had the original form, and the other showed less text but added relevant images to each page.
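A platform like SiteSpect handles traffic splitting for you, but the underlying idea of a 50/50 split is simple: bucket each visitor deterministically so the same person always sees the same variation. Here is a minimal sketch in Python; the function and experiment names are hypothetical, not SiteSpect's API.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "mobile-form-images") -> str:
    """Deterministically bucket a visitor into one of two variations.

    Hashing the visitor ID together with the experiment name keeps the
    assignment stable across visits, so a returning user never flips
    between control and variant mid-test.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 2  # even hash -> control, odd -> variant (~50/50)
    return "control" if bucket == 0 else "variant"

# The same visitor always lands in the same bucket:
print(assign_variant("user-123") == assign_variant("user-123"))  # True
```

Because the hash is effectively uniform, large traffic volumes settle very close to an even split without any shared state between servers.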
Results and Next Steps
The variant page saw a 2.3% increase in conversions, projecting to $1,204,018 in annual incremental revenue. The team immediately directed 100% of traffic to the winning variation. The results also suggested that the condensed text may itself have contributed to the lift, so the optimization team moved on to A/B test condensed versions of text on the desktop site as well.
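The case study does not publish the traffic and revenue figures behind its projection, but the arithmetic for turning a conversion lift into annual incremental revenue is straightforward. The sketch below uses entirely hypothetical inputs to show the shape of the calculation, not to reproduce the university's $1,204,018 figure.

```python
# All inputs below are assumed for illustration -- the case study does not publish them.
annual_visitors = 500_000          # mobile visitors per year (hypothetical)
baseline_cvr = 0.04                # baseline form-conversion rate (hypothetical)
revenue_per_conversion = 2_500.0   # average revenue per submission (hypothetical)
lift = 0.023                       # the 2.3% relative lift measured in the test

extra_conversions = annual_visitors * baseline_cvr * lift
incremental_revenue = extra_conversions * revenue_per_conversion
print(f"${incremental_revenue:,.0f}")  # $1,150,000 with these assumed inputs
```

Swapping in a site's real traffic, conversion rate, and revenue per conversion yields its own projection; even a small relative lift compounds into a large annual number at high volume.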
To learn more about SiteSpect, visit our website.