Case Study: Quantifying the Impact of Page Speed

By SiteSpect Product Management

June 24, 2019

The UK’s leading health retailer, specializing in vitamins, nutritional supplements, food items, healthful cosmetics, and more, had been using SiteSpect for web optimization and personalization. The brand has over 1,300 stores across the UK and elsewhere, and a fast-growing online retail division. This online growth is fueled by the user experience (UX) organization, which has been working with SiteSpect since 2016 to optimize all aspects of the digital experience for its customers.

Taking a new step in their optimization efforts, the brand turned their focus to page speed. Studies have found that a one-second delay in page load time can decrease conversions by 16%, and that the same lag would cost Amazon $1.6 billion a year. Because of SiteSpect’s unique deployment method, the brand has not experienced the page speed penalty typically associated with JavaScript tag-based optimization solutions, and so had not focused on collecting those metrics. Yet understanding the exact impact of page speed would help the UX team prioritize their roadmap and weigh the cost versus benefit of potential future changes. As a first step toward understanding how page load time affects their customers, the UX team decided to run an experiment that would quantify the conversion impact of reduced site performance.

Designing a Slower Site

The team structured the campaign as an A/B split test shown to all users: 50% of visitors to the site, regardless of device (desktop, smartphone, or tablet), would experience the variation, which added a one-second delay to the load of every page.
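
As an aside, one common way to produce a stable 50/50 split is to hash a persistent visitor ID so that a returning visitor always lands in the same variation. The sketch below is purely illustrative and is not SiteSpect’s actual assignment mechanism; the visitor ID and function names are assumptions:

```typescript
// Illustrative sketch only (not SiteSpect's mechanism): derive a stable
// 50/50 assignment from a persistent visitor ID via a simple hash.

function assignVariation(visitorId: string): "control" | "delay" {
  // FNV-1a 32-bit hash; any stable hash of the ID would work here.
  let hash = 0x811c9dc5;
  for (let i = 0; i < visitorId.length; i++) {
    hash ^= visitorId.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  // Even hashes go to control, odd hashes to the delayed variation.
  return hash % 2 === 0 ? "control" : "delay";
}

console.log(assignVariation("visitor-42")); // same result on every visit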

To build this experience, the brand worked with SiteSpect’s optimization team, using the platform’s native product capabilities. Rather than actually slowing the site’s load time, SiteSpect manipulated the CSS to hide all page content until a one-second timer fired, after which the full page content was revealed. This gives the appearance of a slower-loading site and ensures a consistent delay for every visitor in the variation.
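
A minimal sketch of this hide-then-reveal technique is below. The delay constant, style rule, and injection point are illustrative assumptions, not SiteSpect’s actual implementation:

```typescript
// Illustrative sketch only (not SiteSpect's implementation): hide the
// page with an injected CSS rule, then reveal it after a fixed timer.

const DELAY_MS = 1000; // the one-second artificial delay from the test

// Inject the style rule as early as possible so no content paints.
const style = document.createElement("style");
style.textContent = "body { visibility: hidden; }";
document.head.appendChild(style);

// When the timer fires, remove the rule to reveal the fully loaded
// page, so every visitor in the variation perceives the same delay.
window.setTimeout(() => {
  style.remove();
}, DELAY_MS);
```

Because the page actually finishes loading in the background, removing the rule reveals everything at once, which keeps the added delay consistent regardless of a visitor’s real network speed.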

The team hypothesized that this change would worsen the user experience and thus negatively impact conversion rates.

Results

It turned out that slowing down the site did in fact hurt conversions, as expected — but by how much?

The variation with the one-second page delay saw a 10.83% decline in purchase conversion rate (p < 0.01, ~99% significance). Extrapolated to all traffic and all customers, that translates to roughly £10,000 in lost revenue per day. Interestingly, when the data was segmented for mobile devices, the effect was less pronounced: the conversion rate decreased by 4.38%. This was likely because smartphone users experience delays from other sources, such as data network connections, and thus are “trained” to expect lower performance. Another discrepancy surfaced between new and returning users. For new users, purchase rate fell by 16.56% (99% significance), while for returning users the decline was only 6.97% (82% significance). While these smaller numbers are still impactful, they suggest that returning users with some amount of brand loyalty were more tolerant of hiccups in site performance.

“This A/B test was one of our most fruitful and received attention across the company, including from our most senior executives. While we weren’t surprised that page load times directly impact conversion behavior, the specific data and insights that this SiteSpect A/B test revealed give us tremendous power to predict the impact of any changes going forward. Elements like creative assets, functionality, and third-party modules could all add some latency, and this A/B test enables us to predict and mitigate risk while maintaining an optimal user experience,” says the team’s Senior Digital UX & Product Manager.

Of course, this campaign was not run for long, given its deliberately negative impact. The team had to balance the value of the data and insights being generated against the fact that the live campaign was hurting conversions. The UX department took a pragmatic approach, keeping the campaign live just long enough to reach statistical significance on their primary KPIs before ending it to minimize the negative impact on users.
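
For intuition, the sketch below shows one standard way to check such a stopping condition: a two-proportion z-test. The visitor and conversion counts are invented for illustration (chosen to reproduce the reported ~10.83% relative drop); the source does not disclose the brand’s actual traffic figures:

```typescript
// Illustrative two-proportion z-test: one standard way a team might
// decide a deliberately harmful test is significant and can be stopped.

function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA; // control conversion rate
  const pB = convB / nB; // variation conversion rate
  const pPool = (convA + convB) / (nA + nB); // pooled rate under the null
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / nA + 1 / nB));
  return (pA - pB) / se;
}

// Invented counts that reproduce the reported ~10.83% relative decline.
const z = twoProportionZ(5200, 100_000, 4637, 100_000);

// |z| > 2.58 corresponds roughly to p < 0.01 (two-sided), the level
// reported for the overall result.
console.log(z.toFixed(2), Math.abs(z) > 2.58 ? "stop: significant" : "keep running");
```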

It’s uncommon to run an A/B test that deliberately worsens the user experience, especially one designed and implemented by a UX department. But the new knowledge generated by this A/B test is now driving initiatives across the company.

These insights will socialize the importance of page speed to the broader business and keep it at the forefront of decision making. The brand will now prioritize projects on its development roadmap based on their potential impact on page speed, and, conversely, deprioritize projects whose page speed cost may outweigh their projected benefits.

To learn more about SiteSpect, visit our website.
