Case Study
Mountain Warehouse Optimizes Sale Pricing
About Mountain Warehouse
Mountain Warehouse is a UK-based outdoor brand with their own line of competitively priced clothing and gear. Their analytics team has run a strong testing program for a number of years, and experimentation is a driving force behind their digital strategy. Because price setting and sales matter so much in retail, Mountain Warehouse saw some immediate success with optimizing how they displayed prices, especially discounted prices. They began this optimization process with their former tag-based tool, but ran into several limitations:
- Multivariate testing was difficult, which forced them to run separate, sequential tests instead.
- They could not test consistently on their Single-Page App (SPA) sites, so they missed out on some of the data and insights they needed.
- The flicker they saw with their tag-based solution left them uncertain about whether it had affected user behavior.
They switched to SiteSpect in 2019 and decided to revisit the optimization of their pricing display, since SiteSpect removed the risk of flicker skewing results and made it possible to run a multivariate test (MVT) to better understand how the price elements interact to affect performance.
Using a Multivariate Test to Identify the Best Pricing Display
Before testing, Mountain Warehouse’s sale product pricing featured the original price in large strikethrough print, followed by a savings amount in pounds, with the new price listed in smaller, red print underneath.
This display has three factors, each with its own set of questions.
- Is the original price too big compared to the new price? Does this perhaps draw attention away from the discount? Or does the physical size of the pricing impact the perception of the saving?
- Does the discount represented in pounds have more impact, or should it be represented in a savings percentage?
- Is the strikethrough the best way to demonstrate the old price? Or is “now” and “was” language clearer?
In this experiment, Mountain Warehouse wanted to find the optimal way to display these three components of their pricing structure. They had done some preliminary testing with their previous tool, but could not easily test all of the factors in combination. So, with SiteSpect, they created a multivariate test to determine the best combination of factors.
How do we best demonstrate savings?
The driving goals of this experiment were to:
- Make clear that the higher price was reduced.
- Calculate the savings for the customer and display this in a manner that is more clearly understood.
- Display the new, lower price more visibly.
- Make the price display as influential on the customer’s first impression as possible.
These goals informed the team’s hypotheses. They suspected that a strikethrough effect on the old prices would be clearer, that savings displayed in pounds would be more impactful, and that making the old price smaller and current price larger would highlight the savings.
They also tested an animation feature. With their previous tag-based tool, they experienced some flicker when testing their pricing displays. While some of their tests were successful, it raised an interesting question. Was the flicker itself drawing attention to the price? If so, then did the other changes have a positive or negative effect on conversions?
Because flicker was a side-effect rather than a part of the experiment, they were unable to answer that crucial question. Once they switched to SiteSpect they didn’t have any accidental flicker, and could instead introduce an animation that they could test and measure alongside a static version.
The Variations
This test included 24 variations built from combinations of 4 factors (a sketch of how factor levels combine follows the list). The 4 factors were:
Factor 1: Savings: Display savings as a percentage, or as a pound amount.
Factor 2: Previous Price Wording: Include “was” language before the original price or eliminate it and introduce an animated strikethrough.
Factor 3: Now and Was Sizing: Make the old price smaller and the new price larger.
Factor 4: New Price Wording: Introduce “now” language in front of the new price.
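To make the structure of the test concrete, the sketch below shows how a small set of factor levels multiplies into a full grid of multivariate combinations. The factor names and level counts are assumptions chosen so that the combinations work out to the 24 variations described above; this is not SiteSpect's configuration format.

```python
from itertools import product

# Hypothetical factor levels. Names and level counts are illustrative,
# chosen so the combinations multiply out to 24 variations.
factors = {
    "savings_display":   ["pounds", "percentage"],
    "previous_price":    ["strikethrough", "was_wording", "animated_strikethrough"],
    "price_sizing":      ["original", "smaller_old_larger_new"],
    "new_price_wording": ["plain", "now_prefix"],
}

# Each experience in the MVT is one combination of factor levels.
variations = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(variations))  # 2 * 3 * 2 * 2 = 24
```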
Running the Experiment and Iterating Along the Way
After 4 weeks, the factor that introduced “was” before the former price was underperforming. At this point, the campaign was optimized by removing this factor, reducing the number of active experiences to 16. This allowed more traffic to flow to the other, more successful variations, helping to accelerate the campaign towards its conclusion.
After 2 more weeks, the campaign was reviewed and 2 more underperforming experiences were removed from the campaign in order to direct more traffic to stronger performing variants.
Finally, after another 2 weeks, 4 additional underperforming variants were removed. This left 6 final experiences.
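The pruning described above can be thought of as a simple filter: compare each experience's observed rate on the key metric with the control's, and stop serving traffic to experiences that clearly trail it. The sketch below illustrates that idea with made-up rates and an arbitrary threshold; it is not how SiteSpect's reporting or traffic allocation actually works.

```python
# Illustrative add-to-cart rates; every figure here is invented.
control_rate = 0.050

variation_rates = {
    "pct_savings":             0.052,
    "pct_savings_anim_strike": 0.051,
    "was_wording_pounds":      0.043,
    "now_prefix_pounds":       0.046,
    # ... remaining variations omitted
}

def prune(rates, baseline, margin=0.05):
    """Keep variations within `margin` (relative) of the baseline or better."""
    cutoff = baseline * (1 - margin)
    return {name: rate for name, rate in rates.items() if rate >= cutoff}

active = prune(variation_rates, control_rate)
print(sorted(active))  # traffic is then reallocated across the survivors
```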
After 8 weeks the campaign was concluded. Of the remaining active test variants, 2 performed statistically better than the control for the campaign’s main KPIs: Adds to Cart and Purchases.
Along the way, the Mountain Warehouse team worked with their SiteSpect Optimization Consultant extensively, strategizing when and how to iterate.
As Mountain Warehouse’s Will Fielding says, “A key part of the iteration process was the help we got from our SiteSpect consultant along the way. As this was our first MVT, the advice we got helped us confidently analyse the test.”
Results
Across all devices, displaying savings as a percentage rather than as a pound amount performed well: with this variation, Adds to Cart increased by 2.95% and Purchases by 4.16%. The combination of savings as a percentage, an animated strikethrough, and removing the word “now” from the new price increased Adds to Cart by 2.64% and Purchases by 3.67%.
Both variations saw significant uplift among tablet and desktop users. Savings as a percentage combined with an animated strikethrough and the removal of the word “now” lifted Adds to Cart by 9.58% and Purchases by 8.78%, while savings as a percentage alone increased Adds to Cart by 7.45% and Purchases by 7.58%. On smartphones, however, both variations remained flatter on both metrics, most likely because the smaller display size makes these changes less influential.
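As a rough illustration of how an uplift like this can be checked for statistical significance, the sketch below runs a two-proportion z-test on hypothetical visitor and conversion counts. The counts are invented and the choice of test is an assumption on our part; it is not a description of SiteSpect's statistics engine.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of control (a) and variant (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    uplift = (p_b - p_a) / p_a
    return z, p_value, uplift

# Hypothetical counts: 500,000 visitors per arm, purchase counts invented
# so the relative uplift lands close to the reported 4.16%.
z, p, uplift = two_proportion_z(conv_a=24_000, n_a=500_000, conv_b=25_000, n_b=500_000)
print(f"z={z:.2f}  p={p:.2g}  uplift={uplift:.2%}")
```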
In the future, Mountain Warehouse plans to use SiteSpect to determine whether there is a tipping point for displaying savings in pounds versus as a percentage. For example, is a percentage saving more attractive for lower-priced items than for higher-priced ones? Could a hybrid of both formats be used, and at what price points should each apply?