An Interview with Cory Underwood: What Safari ITP 2.3 Means for the Field of Digital Optimization

By Ruby Brown

October 10, 2019


In September 2019, Apple released another update to Safari’s Intelligent Tracking Prevention (ITP), now on version 2.3. This spells big changes for marketers, developers, and a host of software that depends on client-side cookies. We’ve discussed how SiteSpect fares in the wake of these changes (ITP does not affect us), but we wanted to dive a little deeper into the way these developments might change the field of optimization overall. I sat down with Cory Underwood, Senior Programmer / Analyst at L.L. Bean. He currently oversees Google Analytics implementation and consulting there, and previously spent 7 years as the company’s lead A/B test developer.

To start off, can you give us an overview of what Safari ITP is, exactly, and what the change is for 2.3?


Sure. We have to go back to earlier in 2019 to really understand how it formed. Very early in the spring, ITP 2.1 changed the way client-side cookies are set and what their expiries are, and that has impacts on all kinds of functionality, not just A/B testing. But basically, any client-side code that needed to maintain state would get a maximum of 7 days out of that cookie before it got deleted. Any visit after that point, that logic would interpret as a new person.
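To make the mechanics concrete, here is a minimal sketch of the kind of client-side identifier ITP 2.1 affects (the cookie name `visitor_id` is hypothetical):

```typescript
// Minimal sketch: the code requests a one-year lifetime, but Safari with
// ITP 2.1+ caps cookies written via document.cookie at 7 days, so the
// requested expiry is silently shortened.
function setVisitorId(id: string): void {
  const oneYear = 365 * 24 * 60 * 60; // requested lifetime, in seconds
  document.cookie = `visitor_id=${encodeURIComponent(id)}; max-age=${oneYear}; path=/; SameSite=Lax`;
}

function getVisitorId(): string | null {
  const match = document.cookie.match(/(?:^|;\s*)visitor_id=([^;]*)/);
  return match ? decodeURIComponent(match[1]) : null;
}

// If Safari has purged the cookie, this mints a fresh ID, and downstream
// logic treats the returning visitor as a brand-new person.
const visitorId = getVisitorId() ?? Math.random().toString(36).slice(2);
setVisitorId(visitorId);
```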

When 2.2 rolled out on mobile devices in May, under special conditions related to the referring domain and link formatting, those cookies would only get a 24-hour expiry. All of these client-side tools look at first-party cookies set on the client. Because they’re client-side based, and an A/B test normally runs longer than a week, you’re concerned that over the course of the test people are going to get alternate cell assignments and see all the different experiences.

What the vendors did in a lot of cases was start relying on local storage, which wasn’t subject to Safari’s privacy protections. So they either store the segmentation in local storage, or they set a cookie and recreate it from local storage, or something of the sort. It had its limitations, specifically around cross-domain use and things like that, but it basically still allowed the tools to work more or less as they did. A rough sketch of that workaround appears below.
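Here is that sketch, with illustrative names rather than any particular vendor’s API:

```typescript
// Persist the A/B assignment in localStorage and mirror it back into a
// cookie whenever the cookie has been purged. The key name ab_variant is
// illustrative, not any vendor's actual API.
const KEY = "ab_variant";

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|;\\s*)${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function writeCookie(name: string, value: string): void {
  document.cookie = `${name}=${encodeURIComponent(value)}; max-age=${7 * 24 * 3600}; path=/`;
}

function getVariant(assign: () => string): string {
  // Prefer the cookie, fall back to localStorage, and only assign a new
  // variant if both are gone.
  let variant = readCookie(KEY) ?? localStorage.getItem(KEY);
  if (!variant) variant = assign();
  writeCookie(KEY, variant);          // recreate the cookie if it was purged
  localStorage.setItem(KEY, variant); // keep the localStorage copy in sync
  return variant;
}

// Usage: assign 50/50 only when no prior assignment survives anywhere.
const variant = getVariant(() => (Math.random() < 0.5 ? "A" : "B"));
```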

Now with 2.3, which was released with Safari 13, under special circumstances relating to the referrer domain (especially if it’s social), the local storage will also get deleted after 7 days. So, in the absence of a server-side solution setting your state, you’re essentially going to believe that the client is a new person after 7 days or 24 hours, because all of the stateful identifiers on the client get removed. To the server, the person is new, even if they’ve been there every day (as long as 24 hours have passed between visits). And that’s really not ideal from an A/B testing perspective, because you want them to maintain their variation assignment.
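The server-side escape hatch means setting the identifier in an HTTP response header rather than from JavaScript; at the time of ITP 2.3, cookies set this way by the first-party server were not subject to the 7-day or 24-hour client-side caps. A minimal Node sketch, with a hypothetical cookie name:

```typescript
// Minimal sketch of server-set state, using Node's built-in http module.
// The Set-Cookie header comes from the first-party server, so the
// client-side 7-day / 24-hour caps described above do not apply to it.
import { createServer } from "http";
import { randomBytes } from "crypto";

createServer((req, res) => {
  const hasId = /(?:^|;\s*)visitor_id=/.test(req.headers.cookie ?? "");
  if (!hasId) {
    const id = randomBytes(16).toString("hex");
    res.setHeader(
      "Set-Cookie",
      `visitor_id=${id}; Max-Age=${365 * 24 * 3600}; Path=/; HttpOnly; Secure; SameSite=Lax`
    );
  }
  res.end("ok");
}).listen(3000);
```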

What is Safari trying to curb? Is it mainly about A/B testing and personalization? Is it more about ads? What’s Safari’s biggest intention here?


They’re really trying to target cross-site tracking. I think they’re targeting social networks in particular, but the way that they’re going about it and the solutions they’re putting in place are not limited to that kind of tracking. They say “unintended impact” is possible, and that they might, on a case-by-case basis, look at workarounds for it. But these things can break. [You can find that list of unintended impacts here.]

So, theoretically, they acknowledge all that stuff could break as a result of what they’re doing.

What kind of impact are we going to see in A/B testing and Personalization? How might this throw a wrench in current processes, and what best practices should we be adopting now to deal with it?


You definitely need to take a look at your platform and understand how it fundamentally works: how it handles state, and whether that approach is compatible with all of the recent browser announcements. If it’s not, you’ll need to work with whatever team runs that platform to figure out whether it’s even still viable, or whether you have to do additional development to keep it working. But you just can’t ignore it. It’s going to show up in your analytics, and on paper everything would be working as you would expect, but not as you intend. As a result, you would end up with substantially different data than you would if you had all the context.

Let’s say you have a lot of traffic from Safari, and it’s social traffic, so one person comes back every 25 hours. They visit every day, but always just one hour past the window. How many users did you have that month? Technically 1, but your reporting says closer to 30. And you have the worst retention rate ever, because you have 0 retention, while new users are seemingly knocking it out of the park. But that’s not really what’s happening. You might think, “I can cut back on my new user spend because new users are crushing it; the engagement team is knocking it out of the park.” But that would be a mistake.
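As a toy illustration of that arithmetic, assuming a 24-hour identifier lifetime and a visitor who returns every 25 hours:

```typescript
// One real visitor returns every 25 hours for a month; the client-side
// identifier disappears after 24 hours of absence, so every return mints
// a new ID and gets reported as a new user.
const ID_LIFETIME_HOURS = 24; // the ITP 2.2-style cap described above
const VISIT_GAP_HOURS = 25;
const HOURS_IN_MONTH = 30 * 24;

let reportedNewUsers = 0;
let lastVisit = -Infinity;
for (let t = 0; t < HOURS_IN_MONTH; t += VISIT_GAP_HOURS) {
  if (t - lastVisit > ID_LIFETIME_HOURS) reportedNewUsers++; // ID is gone
  lastVisit = t;
}
console.log(`real users: 1, reported new users: ${reportedNewUsers}`); // 29
```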

So that’s a big problem for data analysis. What other impacts are you seeing on user experience, or in any other fields?


I’m going to speak from a more development-oriented point of view. You wouldn’t necessarily always come back to the server to manage state, because maybe the server doesn’t need to know, and yet you still need some sort of long-term storage.

So a good example of this would be GDPR consent dialogues. If a visitor opts out, you don’t want to show that dialogue box on every page. So the client needs to understand that, but the server doesn’t necessarily need to care. But because of the way this is structured, and the need to potentially preserve state longer than 24 hours, you now have to have the client call the server, and have the server do something that the client could previously do on its own. So, in order not to be subject to edge cases around things that aren’t even within your control, like how people get to your site, you now have to have a different architecture for anything concerning state.
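A sketch of what that re-architecture might look like, against a hypothetical first-party /api/consent endpoint:

```typescript
// Instead of remembering the opt-out in localStorage, the client asks the
// server, and the server records the answer against a cookie it set
// itself. The /api/consent endpoint and its response shape are hypothetical.
async function shouldShowConsentDialog(): Promise<boolean> {
  const res = await fetch("/api/consent", { credentials: "same-origin" });
  const { decided } = await res.json(); // e.g. { decided: true, optedOut: true }
  return !decided; // only show the dialogue if no decision is on record
}

async function recordOptOut(): Promise<void> {
  await fetch("/api/consent", {
    method: "POST",
    credentials: "same-origin",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ optedOut: true }),
  });
}
```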

How should we be adapting?

You have to look at how you set state, essentially, and go through all the code and ask: do I have to do this differently? Do I even need to do it at all?

From that you figure out, “okay, I need to set state under these circumstances, so I’m going to have to re-architect that.” And that is commonly going to require additional development investment.

I also think what’s going to happen is that marketers who are used to doing all this tag installation are going to have to understand how a tag works, because that’s now as important as, if not more important than, what the tag actually does. So I think what that could end up resulting in is tagging moving to more of a developer-focused workflow. And companies are going to have to look at all of this and figure out, “how do I support this?”

Now, maybe you don’t have a lot of Safari traffic and you can more or less skip it. But maybe you’re 50, 60, 70, 80 percent Safari traffic because you get all of your traffic from iPhone users, and then you’re going to have to stop feature development to fix this if you want to maintain collection accuracy.

It sounds like marketers are going to have to get a lot more technical, or work more closely with developers.

I think that’s basically going to be the only path forward, because you can’t just add a marketing pixel to the site and expect it to work as you intend.

We not only have Safari ITP 2.3, but also Firefox’s Enhanced Tracking Protection, and legal intervention like GDPR and CCPA. There’s a lot of discussion in the zeitgeist right now about data privacy and security. How is this changing the field of digital optimization?


There are basically two different concerns here. There’s, technically, what the browser allows you to collect, and then there’s, legally, what you’re allowed to collect, regardless of what the browser says you can or cannot do. And I think browsers are trying to get ahead of it; at least Safari and Firefox are really using that in their marketing.

I think my major concern about what the browsers are doing is that they’re getting out of sync; they’re all doing something different. That’s problematic from a development point of view, because essentially you have to have your code either behave as if every browser were the most restrictive one, or maintain some sort of cascade of conditional statements that says, “it’s probably this, so I should execute this way.”
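The first option, coding to the most restrictive browser, amounts to treating all client-side storage as ephemeral. A sketch of that stance (the wrapper is illustrative, not a standard API):

```typescript
// Treat all client-side storage as a best-effort cache: reads may come
// back empty at any time, so every caller must handle the "state is gone"
// case rather than assume the requested expiry was honored.
const ephemeral = {
  get(key: string): string | null {
    try { return localStorage.getItem(key); } catch { return null; }
  },
  set(key: string, value: string): void {
    try { localStorage.setItem(key, value); } catch { /* storage denied */ }
  },
};

// Whatever lifetime was requested, on some browsers the value won't
// survive, so a default is always supplied.
const theme = ephemeral.get("theme") ?? "default";
```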

And then the impact on reporting and personalization isn’t consistent either. But the big thing, and I think this could actually play out, is that you just invest in server-side tools to avoid or mitigate the impact.

Now, on the other side of the coin, the laws are requiring you to build a lot of infrastructure to be compliant. But the fact of the matter is that if you like to do online shopping, or banking, or any of those things, you need at least some kind of cookie to maintain state with the server. So until alternative tech is developed, that’s not going away.

So I expect the browsers to eventually get more or less on the same page. They’re not right now, but we’re heading in that direction. I do think new tech will have to be developed to get there.

What should everybody be doing differently right now?


Stop looking at stuff in aggregate and start looking at it by browser. But also have context on how each browser operates, because you’re going to want to know that before you start shifting your spend. Otherwise you’re going to be making choices that seem logical but are ultimately based on a flawed understanding of what’s actually occurring.
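As a small sketch of what “look at it by browser” can mean in practice, here is a coarse bucketing of sessions by browser family before computing a new-user rate (the session record shape is hypothetical, and the user-agent sniffing is deliberately rough):

```typescript
interface Session { userAgent: string; isNewUser: boolean; }

// Coarse user-agent bucketing; Chrome must be tested before Safari because
// Chrome's UA string also contains "Safari/".
function browserFamily(ua: string): string {
  if (/Firefox\//.test(ua)) return "Firefox";
  if (/Chrome\//.test(ua)) return "Chrome";
  if (/Safari\//.test(ua)) return "Safari";
  return "Other";
}

// New-user rate per browser family, instead of one aggregate number.
function newUserRateByBrowser(sessions: Session[]): Map<string, number> {
  const totals = new Map<string, { sessions: number; fresh: number }>();
  for (const s of sessions) {
    const f = browserFamily(s.userAgent);
    const t = totals.get(f) ?? { sessions: 0, fresh: 0 };
    t.sessions++;
    if (s.isNewUser) t.fresh++;
    totals.set(f, t);
  }
  return new Map([...totals].map(([f, t]) => [f, t.fresh / t.sessions]));
}
```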

You can find more from Cory Underwood on LinkedIn, where he writes about Safari ITP and much, much more.

To learn more about SiteSpect, visit our website.
