How to Test and Optimize While Doing a Site Redesign

By Sally Hall

May 14, 2019


So, you want to do a site redesign. Maybe you’re doing a complete rebranding, updating your look and feel for the season, or changing the structure of a single page. Maybe you’re creating device-specific checkout pages or product images. No matter what your redesign entails, you’ll need to know how it impacts all of your metrics, and to do that you’ll need to thoroughly A/B test all of your changes. In this blog, I’m going to walk through best practices for implementing a redesign, as well as common pitfalls to avoid.

A/B Test Ahead of Implementation

The best thing you can do for your redesign is to start A/B testing before you launch your new site or pages. There are plenty of circumstances where this isn’t possible — a brand image decision, for example — and I’ll walk through what to do in that case a little later. However, A/B testing throughout the design process will give you the knowledge and agility for the biggest possible impact. To do this, introduce each element of your redesign individually first, then A/B test it in conjunction with the rest of your design changes.

A/B Testing in Action

Let’s use the homepage update from Adventures In SiteSpect. You can scroll through the images in that blog to see the updates my colleague Ruby made to an ecommerce homepage; in short, the changes were:

  • New hero image
  • New hero copy
  • New featured items
  • New call to action

[Image: new seasonal homepage]

The goal of the changes in this case was to increase purchases of seasonal items for specific customers. However, there are a number of conversion points on the page, and a number of ways for customers to interact with the design. Because of that, it would be a mistake to group the entire design together and A/B test it as a whole. If you did that, your results — good or bad — would be misleading and harder to replicate. For example, clicks on the primary CTA button could drop relative to the control, but that change could be caused by the CTA copy, the button color, the button placement, or something that has nothing to do with the button at all, like the hero image!

A/B testing everything at once obscures insight into what elements helped or hurt conversions.

Instead, you would set up a series of independent campaigns, each A/B tested against the original homepage (the sketch below shows one way to picture that split). But it’s not quite as simple as A/B testing each element on its own, because some elements depend on each other or only make sense when paired. This is where multivariate testing comes into play.
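To make “independent campaigns” concrete, here is a minimal Python sketch (not SiteSpect functionality; SiteSpect handles traffic assignment for you) of how each element gets its own split against the original, so a visitor’s bucket in one campaign doesn’t constrain their bucket in another. The campaign names and visitor ID are hypothetical.

    # Illustrative only: each campaign buckets the same visitor on its own,
    # so every element is measured against the original independently.
    # Campaign names and the visitor ID below are hypothetical.
    import hashlib

    CAMPAIGNS = ["hero_image", "hero_copy", "featured_items", "cta_button"]

    def assignment(visitor_id: str, campaign: str) -> str:
        """Deterministically bucket a visitor for one campaign."""
        digest = hashlib.md5(f"{campaign}:{visitor_id}".encode()).hexdigest()
        return "variation" if int(digest, 16) % 2 == 0 else "control"

    visitor = "visitor-12345"
    for campaign in CAMPAIGNS:
        print(campaign, "->", assignment(visitor, campaign))

Because each campaign has its own split, a drop in CTA clicks points at the CTA campaign alone rather than at the redesign as a whole.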

Multivariate Testing for Codependent Elements

Sticking with the example of a homepage CTA, before you even touch the hero image you should test the CTA copy on its own. The problem is that there are a number of factors involved: copy, color, placement, font, size, and style. In SiteSpect, you can create a Multivariate Test where the program automatically creates every possible combination and tests each one with equal traffic. Of course, some combinations may be out of the question (e.g., white text on a white button, or color combinations that clash with your brand guidelines). If that’s the case, you can simply turn off those variations. On the other hand, you may only have a few combinations even worth testing; in that case, you might build the multivariate test manually.
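To show what “every possible combination” means in practice, here is a rough Python sketch of the factor grid for a CTA like the one above, with clashing combinations filtered out. The factor values and the exclusion rule are placeholders, not anything SiteSpect generates for you.

    # A sketch of the combinations a multivariate test enumerates for some
    # hypothetical CTA factors. Values and the exclusion rule are made up.
    from itertools import product

    factors = {
        "copy": ["Shop the Sale", "See Seasonal Picks"],
        "text_color": ["white", "navy"],
        "button_color": ["white", "brand_orange"],
        "placement": ["above_fold", "below_hero_copy"],
    }

    def is_valid(combo):
        # Example exclusion from the text above: white text on a white button.
        return combo["text_color"] != combo["button_color"]

    combos = [dict(zip(factors, values)) for values in product(*factors.values())]
    testable = [c for c in combos if is_valid(c)]
    print(f"{len(combos)} combinations generated, {len(testable)} worth testing")

With a handful of values per factor the grid stays manageable, but it grows multiplicatively as you add factors, which is why turning off nonsensical combinations (or building a small test manually) matters.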

Once you have a winner (or two) for one element, you can multivariate test other elements and start putting them together. Remember, depending on your traffic volume, you can have several of these experiments running at once — one to figure out the optimal CTA and one to A/B test the hero image, for example. So this testing doesn’t need to drag out your design process; in fact, in most cases it speeds things up, because you have the data to back your design choices.
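How many of these experiments you can run at once comes down to traffic. As a rough, back-of-the-envelope sanity check (my own math, not a SiteSpect feature), a standard two-proportion sample-size estimate shows roughly how many visitors each variation needs; the baseline conversion rate and lift below are placeholders.

    # Rough per-variation sample size for detecting a relative lift in
    # conversion rate. The baseline and lift values are placeholders.
    from statistics import NormalDist

    def visitors_per_arm(baseline, lift, alpha=0.05, power=0.8):
        """Approximate visitors needed in each arm of an A/B test."""
        p1, p2 = baseline, baseline * (1 + lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_power = NormalDist().inv_cdf(power)
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return int((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2) + 1

    # e.g., a 3% baseline conversion rate and a hoped-for 10% relative lift
    print(f"~{visitors_per_arm(0.03, 0.10):,} visitors per variation")

If a single experiment needs tens of thousands of visitors per arm, that tells you how many campaigns your traffic can realistically support in parallel before each one starts to drag.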

If you can start this multivariate testing process before you have a “final” design, or test in tandem with designing, then you’ll have the space for a truly data-driven design based on actual user experience.

When You Have to Implement a Full Design

A/B testing or multivariate testing during design is the ideal situation, but there are plenty of cases where you just can’t. Some that I’ve worked on include a new logo that necessitates a new color scheme and font style, a company-wide shift in business focus and goals, and a new software acquisition that needs real estate on the website. If this is you, the biggest mistake you can make is A/B testing your entire new design as one variation against your control.

Even if you A/B test your new design as one variation and it wins, you’ve lost out on valuable information that would drive optimization in the future. Instead, measure each element of the site independently. That way, if one element is dragging down your metrics, you can at least mitigate the effect. Or, if one element increased conversions, you may be able to implement it in additional locations.

A/B Testing in Action

Let’s go back to the seasonal homepage example. Say you’re given this completed design and now you need to A/B test and implement it. You should implement those changes in SiteSpect first, separating each component into its own variation (essentially following the ideal process above). The difference, of course, is that you have more limits on what you can work with.

As you build this new design in SiteSpect, separate out each element. In this case, as above, you have your hero image and copy as one variation, your CTA button as another (with button copy, placement, and color as a multivariate test), and, farther down the page, a new set of featured products as a third.
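Before building this in SiteSpect, it can help to outline the split in plain data so the separation stays clear. This is just a hypothetical outline, not SiteSpect configuration.

    # A plain-Python outline of how the finished design splits into separately
    # measured campaigns. Illustrative only, not SiteSpect configuration.
    redesign = {
        "hero": {"type": "A/B", "variations": ["original", "seasonal hero + copy"]},
        "cta_button": {
            "type": "multivariate",
            "factors": ["button copy", "placement", "color"],
        },
        "featured_products": {"type": "A/B", "variations": ["original", "seasonal set"]},
    }

    for element, campaign in redesign.items():
        print(f"{element}: {campaign['type']} test against the original homepage")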

When you get your metrics, you’ll be able to see which elements helped conversions and how they worked together.

How to Execute a Successful Redesign

When you redesign any part of your digital experience, keep in mind that elements work both together and individually. If your A/B test results are flat or unclear, ask whether you can break your variations into more distinct elements.

To learn more about SiteSpect, visit our website.


Sally Hall

Sally Hall is an Optimization Consultant at SiteSpect, guiding SiteSpect users on the road to optimization. She has more than 10 years of experience as a web optimizer and testing manager for enterprise brands. She is based in Austin, Texas.
