In this post, we’ll look at how to measure and communicate the results of your website optimization program.
A/B testing platforms and analytics applications are naturally related: One is fundamentally designed to drive improvements on your site, and the other is fundamentally designed to help quantify the value of those improvements. Although some A/B testing applications do a good job of helping quantify success on their own, the integration of A/B testing with analytics allows companies to develop a very broad view of the results of individual A/B tests.
On the subject of measurement, one point is very important and frequently overlooked: The combination of measurement and A/B testing should support both optimization and an incremental learning process about your visitors and customers.
Assuming you have a robust measurement program already in place, integrating A/B testing into those efforts is often trivial and requires little more than patience. The upside of taking the time to integrate these systems correctly includes the ability to evaluate A/B testing efforts against multiple criteria and to evaluate A/B test participant behavior across multiple sessions.
Seemingly dramatic design changes often have no significant impact when examined using simple measures such as click-through and conversion rate. An increasing number of companies have started applying more complex measures, such as “return visitation rate” and “lifetime customer value,” and using more qualitative measurement systems to develop a more holistic view of A/B test impact.
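To make "return visitation rate" concrete, here is a minimal sketch of how it might be computed from exported visit logs. The `(visitor_id, visit_date)` record format is a hypothetical export, not any specific analytics tool's schema:

```python
from collections import defaultdict
from datetime import date

def return_visitation_rate(visits):
    """Share of visitors who came back on more than one day.

    `visits` is a list of (visitor_id, visit_date) pairs -- an
    assumed export format for illustration only.
    """
    per_visitor = defaultdict(set)
    for visitor_id, visit_date in visits:
        per_visitor[visitor_id].add(visit_date)
    returning = sum(1 for days in per_visitor.values() if len(days) > 1)
    return returning / len(per_visitor) if per_visitor else 0.0

log = [
    ("a", date(2024, 1, 1)), ("a", date(2024, 1, 8)),
    ("b", date(2024, 1, 2)),
    ("c", date(2024, 1, 3)), ("c", date(2024, 1, 9)),
]
print(return_visitation_rate(log))  # 2 of 3 visitors returned
```

Computing a measure like this per test variant, rather than site-wide, is what lets you see whether a design change affects long-term behavior and not just the immediate click.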
Although the specific measures you take will likely vary from A/B test to A/B test, depending on the systems you have in place, an important consideration is the ability to integrate those systems. The basic integration of A/B testing and measurement systems involves exchanging data about A/B test participation, either by after-the-fact bulk data loading or by real-time tag transformation. Done well, such integration allows the measurement team to create segments, build key performance indicators, and drill down into the activity of individual visitors based on A/B test participation (via the use of data warehousing and customer experience management technologies).
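The segment-building side of that integration amounts to a join: match each measured session to the variant the visitor saw, then roll up outcomes per variant. This is a toy sketch under assumed data shapes (bulk-exported `(visitor, variant)` participation records and `(visitor, converted)` session records), not any vendor's actual API:

```python
from collections import defaultdict

def conversion_by_variant(participation, sessions):
    """Join A/B test participation onto session outcomes and
    report conversion rate per variant (a toy segment report)."""
    variant_of = dict(participation)          # visitor_id -> variant
    totals = defaultdict(lambda: [0, 0])      # variant -> [sessions, conversions]
    for visitor_id, converted in sessions:
        variant = variant_of.get(visitor_id)
        if variant is None:
            continue                          # visitor never entered the test
        totals[variant][0] += 1
        totals[variant][1] += int(converted)
    return {v: conv / n for v, (n, conv) in totals.items()}

participation = [("a", "control"), ("b", "challenger"), ("c", "challenger")]
sessions = [("a", True), ("b", False), ("c", True), ("d", True)]
print(conversion_by_variant(participation, sessions))
```

In a real deployment the same join is done inside the data warehouse or analytics platform, which is what enables drilling from a variant-level KPI down to individual visitor activity.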
The integration of measurement and A/B testing is designed to help you better quantify A/B test performance and the resulting impact on the business. Integrated systems, when properly used, support a wide range of metrics and measures and support the analysis of both the short- and long-term impact of A/B tests. Few companies take the long-term view of A/B testing today, but that view most often reveals a great deal about visitor behavior.
Communicating Your A/B Test Results With Actionable Analysis
Once you’ve formed a great A/B testing team, gotten your senior stakeholders on board, and received solid testing plans back from the organization, you owe it to the organization to communicate back what you learned after you ran your A/B tests.
To take full advantage of your investment in A/B testing, consider taking a two-pronged approach to communicate the results by specifically addressing the needs of your two audiences—the group requesting the A/B test, and the wider organization:
- For the group that requested the A/B test, go deep. Plan to provide detailed information about the A/B test and what was learned, including a reiteration of the A/B test plan, an A/B test timeline, a summary of resources used in the project (ideally tied to costs associated with their time), plus as much detail about the results of the A/B test as possible, including the underlying statistical data. In our experience, mapping external events onto the A/B test results, whenever possible, is an excellent idea. For example, if you know a substantial email campaign occurred, or if the New York Times wrote about your company, add a point on your results timeline that specifically addresses those events.
- For the wider organization, go broad. The rest of the company is likely not as interested in the details of the A/B test as they are in the “big picture,” including the bottom-line summary of results and a comparison of how similar A/B tests have performed in the past. One thing that gets people’s attention better than anything else is associating A/B test results with money—either incremental revenue or operational savings. Funny how the addition of “and we expect this change to result in an estimated $20,000,000 in incremental revenue every year” gets people’s undivided attention.
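The arithmetic behind a headline number like that is simple enough to sketch. The figures below (annual visitors, conversion rates, average order value) are entirely hypothetical inputs you would replace with your own; this is a back-of-the-envelope estimate, not a statistical projection:

```python
def incremental_annual_revenue(visitors_per_year, baseline_cr,
                               challenger_cr, avg_order_value):
    """Rough annual revenue impact of a winning challenger design.

    All four inputs are assumptions you must supply; the function
    just does the arithmetic behind the headline figure.
    """
    extra_orders = visitors_per_year * (challenger_cr - baseline_cr)
    return extra_orders * avg_order_value

# Hypothetical example: 50M annual visitors, conversion lifted from
# 2.0% to 2.5%, $80 average order value -- roughly $20M per year.
print(incremental_annual_revenue(50_000_000, 0.020, 0.025, 80.0))
```

Presenting the inputs alongside the result also lets skeptical stakeholders challenge the assumptions rather than the conclusion.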
For both groups, make sure you take a bottom-line, up-front approach toward the communication of results. By doing so, you dramatically increase the likelihood that important news gets noticed, and you allow people to decide for themselves how much attention to give the details.
Whenever possible, present A/B test results in person and in a format that encourages interaction. Especially when companies are new to the A/B testing process, such social interaction helps to improve organizational awareness about A/B testing while challenging the A/B testing team members to continually improve their ability to present, discuss, and defend A/B test results. The value of the latter cannot be overstated, because so few employees today have experience with A/B testing and optimization.
Consider documenting A/B test results electronically using an internal blog or wiki. The wiki format especially lends itself to creating a complete body of knowledge—what was the A/B test plan, what were the measures of success, what happened, and what was learned—in a searchable format. When you start to aggressively A/B test, people’s collective ability to recall every A/B test and all that has been previously learned becomes limited. Having a resource that can be easily searched at the onset of a new project will allow the A/B testing team to build on its knowledge and not repeat past mistakes.
The most important thing when presenting A/B test results is to provide the audience with actionable recommendations. Nothing is less satisfying than a great presentation chock full of data with no recommendations for action—especially in the current and decidedly harsh economic climate. The best A/B testing teams will go the extra mile and make multiple recommendations in an attempt to capitalize on the results and build consensus for the adoption of the successful challenger design.
Use the presentation of recommendations to suggest other opportunities for A/B testing. The ultimate goal is to build a culture of A/B testing within your organization, one that is willing to increasingly incorporate data from measurement and A/B testing efforts into the overall business decision-making process.
A/B testing can be a powerful ally to the business—but only when the organization takes the time to understand what can be learned and then acts on that information. The worst-case scenario in A/B testing and optimization is one where measurement guides the testing, the testing informs good recommendations, and those recommendations are then ignored.
(For more information, download the whitepaper titled “Successful Web Site Testing Practices: Ten Best Practices for Building a World-Class Testing and Optimization Program.”)
To learn more about SiteSpect, visit our website.