Part 5/6: Was the User’s Experience Any Good?

By Justin Bougher

March 25, 2020


Welcome to installment 5 of my 6-part series, “5 Questions You Need to Answer to Optimize an Experience”. In the first 3 blogs, I introduced my approach to optimization, and I walked through ways to answer the questions, Who is the user?, What experience should you show a user?, and How should you deliver that experience? In this blog, I’m walking through how to answer the question, Was the user’s experience any good?

As we all embrace and refine data-driven decision making within our own organizations, we find ourselves more frequently asking the question, "Was the user's experience any good?" A great experience not only drives revenue and engagement in each session, but encourages loyalty and fosters repeat business. Most of us rely on a variety of data sources to tell us whether a given experience is effective. Unfortunately, much of that data isn't as straightforward as it may seem, and a mistake could steer your organization in the wrong direction. To combat this, you need to dig in and understand how each piece of data is collected, where it came from, and what it tracks. Here are some key points to keep in mind as you assess the effectiveness of user experiences.

Data Accuracy

The inconvenient truth about most data analytics solutions is that many common methods of capturing data have inherent accuracy issues. For example, many solutions use JavaScript tags, which rely on the browser to send information. JavaScript sometimes misfires, causing inaccurate data, and browsers can have their own issues that are out of your control. This can lead to several percentage points of inaccuracy, which is a huge deal for most decisions.
Tools that don't rely on JavaScript are the best route to ensuring accuracy. If you don't have this option, then verifying your data against multiple sources and ensuring a large enough sample size will mitigate the risk of errors driving bad business decisions. For example, if you use a CMS that gives you user behavior data, you will also want to collect that data in Google Analytics or another analytics tool. That way, if you notice discrepancies, you can diagnose the problem and get more out of the information you have.
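As a rough illustration of cross-checking two sources, the sketch below compares hypothetical daily pageview counts from a CMS and from a second analytics tool and flags days where they disagree by more than a tolerance. The data, names, and 5% threshold are all invented for the example; pick a tolerance that reflects the accuracy your decisions require.

```python
# Hypothetical daily pageview counts from two independent sources.
cms_data = {"2020-03-01": 10450, "2020-03-02": 9800, "2020-03-03": 11200}
ga_data = {"2020-03-01": 10020, "2020-03-02": 9750, "2020-03-03": 10350}

def flag_discrepancies(source_a, source_b, tolerance=0.05):
    """Return dates where the two sources disagree by more than `tolerance`."""
    flagged = []
    for date in source_a.keys() & source_b.keys():
        a, b = source_a[date], source_b[date]
        if abs(a - b) / max(a, b) > tolerance:
            flagged.append(date)
    return sorted(flagged)

print(flag_discrepancies(cms_data, ga_data))  # → ['2020-03-03']
```

A flagged day doesn't tell you which source is right; it tells you where to start diagnosing, e.g. a misfiring tag or an ad blocker suppressing one collector.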

ITP, ETP, and What Happened to My Data?

Last year Safari introduced Intelligent Tracking Prevention (ITP), and Firefox introduced Enhanced Tracking Protection (ETP). For in-depth details, review our page on optimization and browser tracking prevention, but in short, both browsers now prevent cross-site tracking, and Safari now "forgets" users after 24 hours. Microsoft Edge and Google Chrome have followed up with their own anti-tracking updates. These updates don't affect SiteSpect's data, but you will need to be aware of discrepancies with any other tool you use that collects data.

This means that tracking a user across multiple sessions is more difficult in these browsers and may be distorting your reports. Many organizations are seeing an increase in first-time users and a decrease in repeat visits, but this doesn't reflect actual user behavior. Again, your data may look normal, but may not reflect what has actually transpired. If you are unsure whether you have a problem, segment your data by browser to see if your first-time versus repeat visitor data is significantly different in each segment.
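A minimal sketch of that browser segmentation, using invented session records: compute the first-visit rate per browser and compare. If the rate is much higher in Safari or Firefox than in other browsers, tracking prevention, not a real change in behavior, is the likely cause.

```python
# Hypothetical session records: (browser, is_first_visit)
sessions = [
    ("Safari", True), ("Safari", True), ("Safari", True), ("Safari", False),
    ("Chrome", True), ("Chrome", False), ("Chrome", False), ("Chrome", False),
]

def first_visit_rate_by_browser(sessions):
    """Share of sessions per browser that look like a first visit."""
    totals, firsts = {}, {}
    for browser, is_first in sessions:
        totals[browser] = totals.get(browser, 0) + 1
        if is_first:
            firsts[browser] = firsts.get(browser, 0) + 1
    return {b: firsts.get(b, 0) / totals[b] for b in totals}

rates = first_visit_rate_by_browser(sessions)
# A much higher first-visit rate in Safari than in Chrome hints that
# ITP is "forgetting" returning users, not that behavior changed.
```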

For more information on our approach to browser tracking prevention, watch this webinar.

The Wrong Experience or the Wrong Delivery?

Finally, if your data suggests a bad user experience, don't write it off immediately. Take some time to verify that it was delivered correctly. We know there is a strong correlation between site performance and conversion. If you notice that a new experience is underperforming, make sure you look not only at the business KPIs but also at performance metrics with Real User Monitoring. If a poorly performing experience is also loading more slowly, users may be responding to the slowness rather than to the experience itself. At that point, you will want to investigate why the experience was slower. It could be a bug or a limitation of your optimization software. Either way, you need to solve that problem before you can write the experience off as a failure.
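To make that check concrete, here is a small sketch over invented RUM-style records: for each variant it reports average load time alongside conversion rate, so a slow-and-underperforming variant stands out. The record shape and numbers are assumptions for illustration, not any particular RUM tool's output.

```python
# Hypothetical RUM records: (variant, load_time_ms, converted)
records = [
    ("new", 3200, False), ("new", 2900, False), ("new", 3100, True),
    ("control", 1400, True), ("control", 1600, True), ("control", 1500, False),
]

def summarize(records):
    """Average load time and conversion rate per variant."""
    out = {}
    for variant, load_ms, converted in records:
        s = out.setdefault(variant, {"n": 0, "load": 0, "conv": 0})
        s["n"] += 1
        s["load"] += load_ms
        s["conv"] += converted
    return {
        v: {"avg_load_ms": s["load"] / s["n"], "conv_rate": s["conv"] / s["n"]}
        for v, s in out.items()
    }

stats = summarize(records)
# If the "new" variant converts worse AND loads much slower, investigate
# the slowness before concluding the experience itself failed.
```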

Data-driven decision making isn’t always as straightforward as it seems. When trying to understand whether a user experience was any good, be sure to also interrogate your data and how you collect it.

To learn more about SiteSpect, visit our website.



Justin Bougher was the VP of Product at SiteSpect.

