Keeping Up With Browser Privacy Changes with Cory Underwood

By Ruby Brown

May 20, 2020

In 2019, Safari’s Intelligent Tracking Prevention (ITP), first introduced in 2017, advanced to version 2.3. Firefox then released its own set of tracking prevention policies: Enhanced Tracking Protection. Closely following suit, Microsoft Edge and Google Chrome each announced their own sets of tracking protection features in early 2020. While each browser now has its own rules, the common theme remains: the game is changing around using cookies for tracking.

At each of these turns I sat down with Cory Underwood, current Platform Engineer at Search Discovery and longtime programmer and analyst, to dive deeper into both the technical side of these changes and what they mean for marketers and developers. Below are parts of our conversations on what browser tracking changes mean for optimization and analytics. You can watch the recorded webinar, “Keeping Up with Browser Privacy Changes,” here.

To Start Off, Can You Give Us An Overview Of What Safari ITP Is, Exactly, And What Changed In 2.3?

Sure. We have to go back to earlier in 2019 to really understand how it formed. Very early in the spring [of 2019], ITP 2.1 changed the way client-side cookies are set and what their expiries are, and that has impacts for all kinds of functionality, not just A/B testing. Basically, any client-side code that needed to maintain state would get a maximum of 7 days out of that cookie before it was deleted, and any subsequent visit after that point would be interpreted as a new person.
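As an illustration, here is a minimal sketch of the kind of client-side cookie ITP 2.1 affects; the cookie name ab_variant is hypothetical, not any particular vendor’s implementation:

```typescript
// The script requests a one-year lifetime, but under ITP 2.1 Safari caps
// cookies written via document.cookie at 7 days.
const oneYear = 60 * 60 * 24 * 365;
document.cookie = `ab_variant=B; Max-Age=${oneYear}; Path=/; Secure; SameSite=Lax`;

// Later visits read the cookie back to keep the same variation assignment.
// Once Safari purges it, the visitor looks like a brand-new person.
function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

const variant = readCookie("ab_variant"); // null again after the purge
```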

When 2.2 rolled out on mobile devices in May, under special conditions related to the referring domain and link formatting, cookies would only get a 24-hour expiry. All of these client-side tools look at first-party cookies set on the client. Because they’re client-side based, and an A/B test normally runs longer than a week, the concern is that during the course of the test people will get alternate cell [variation] assignments and see all the different experiences.

What the vendors did in a lot of cases was start relying on local storage, which wasn’t subject to Safari’s privacy protections. So they either store the segmentation in local storage, or they set a cookie and recreate it from local storage, or something of the sort. It had its limitations, specifically around cross-domain scenarios, but it basically still allowed the tools to work more or less as they did.
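A hedged sketch of that workaround, reusing the hypothetical readCookie helper and ab_variant name from above: mirror the assignment into local storage and recreate the cookie from it on each visit.

```typescript
// Local storage was not subject to ITP's cookie caps at the time, so the
// assignment survives there even after Safari deletes the cookie.
const KEY = "ab_variant";

function persistVariant(variant: string): void {
  document.cookie = `${KEY}=${variant}; Max-Age=${60 * 60 * 24 * 365}; Path=/`;
  localStorage.setItem(KEY, variant);
}

function restoreVariant(): string | null {
  const fromCookie = readCookie(KEY); // readCookie as sketched earlier
  if (fromCookie) return fromCookie;
  const fromStorage = localStorage.getItem(KEY);
  if (fromStorage) persistVariant(fromStorage); // recreate the purged cookie
  return fromStorage;
}
```

As the cross-domain caveat above suggests, local storage is scoped to a single origin, so this only papers over the problem on one domain at a time.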

Now with 2.3, which was released with Safari 13, under special circumstances relating to the referrer domain, especially if it’s social, the local storage will also get deleted after 7 days. So, in the absence of a server-side solution setting your state, you’re going to essentially believe that the client is a new person after 7 days or 24 hours, because all of the stateful identifiers on the client get removed. To the server the person is new, even if they’ve been there every day (as long as 24 hours have passed between each visit). And that’s really not ideal from an A/B testing perspective, because you want them to maintain their variation assignment.
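For concreteness, one possible shape of a server-side solution that sets your state, assuming a Node.js backend and a hypothetical /assign-variant endpoint; at the time, cookies delivered via the Set-Cookie response header were not subject to ITP’s caps on script-writable storage:

```typescript
// Minimal Node.js sketch: the server, not client-side script, assigns the
// variation and persists it in an HTTP cookie.
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url === "/assign-variant") {
    const variant = Math.random() < 0.5 ? "A" : "B";
    // No HttpOnly flag, so client-side testing code can still read it.
    res.setHeader(
      "Set-Cookie",
      `ab_variant=${variant}; Max-Age=${60 * 60 * 24 * 365}; Path=/; Secure; SameSite=Lax`,
    );
    res.end(variant);
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(3000);
```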

Now Can You Break Down Google Chrome’s Cookie Changes?

Cookies can be either first-party or third-party, depending on how they get loaded onto the page and in what context. That can lead to some unintended scenarios, because it enables an attack vector known as cross-site request forgery. So Google is hoping to address that by changing the default behavior.

The current behavior is to always share cookies in all contexts, unless the programmer specifically sets something to prevent that. So it’s an insecure, open default. What they’re doing is closing that off by making the default SameSite=Lax, so the developer has to explicitly change the attribute for a cookie to be available across sites.
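A short sketch of what that means for a developer, continuing the hypothetical Node.js example (cookie names are illustrative):

```typescript
import type { ServerResponse } from "node:http";

// Under the new default, a cookie with no SameSite attribute is treated as
// SameSite=Lax and withheld from cross-site requests. Cross-site use now
// requires an explicit opt-in, and the cookie must also be Secure.
function setCookies(res: ServerResponse): void {
  res.setHeader("Set-Cookie", [
    // Previously sent in all contexts by default; Chrome now treats it as Lax.
    "session=abc123; Path=/; Secure",
    // Explicit opt-in for third-party (cross-site) contexts.
    "tracker=xyz789; Path=/; Secure; SameSite=None",
  ]);
}
```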

What Are Safari And Google Trying To Curb? Is It Mainly About A/B Testing And Personalization? Is It More About Ads? What’s Safari’s Biggest Intention Here?

Safari is really trying to target cross-site tracking. And I really think they’re targeting social networks in particular, but the way they’re going about it and the solutions they’re putting in place are not limited to that kind of tracking. They say “unintended impact” is possible, and that they might, on a case-by-case basis, look at workarounds for it. But these things can break.

So, theoretically, they acknowledge that all of that functionality could break as a result of what they’re doing.

With Google, where I really think it’s going to hit is things like third-party ad attribution, and unintentionally it’s going to impact third-party login flows [using a login from one site to authenticate you on another, e.g., using Google to log in to Instagram], which I know is something they’re working on. The concept is to prevent cookies from allowing you to be inadvertently tracked cross-site, and as a result, technology that has to function cross-site either has to be updated or it stops working.

What Kind Of Impact Are We Going To See In A/B Testing And Personalization? How Might This Throw A Wrench In Current Processes, And What Best Practices Should We Be Adopting Now To Deal With It?

You definitely need to take a look at your platform and understand how it fundamentally works: how it manages state, and whether the way it works is compatible with all of the recent browser announcements. If it’s not, you’ll need to work with whatever team runs that platform to figure out whether it’s even still viable, or whether you have to do additional development to keep it working. But you just can’t ignore it. It’s going to show up in your analytics, and on paper everything will appear to be working as designed, but not as you intend. As a result, you’ll end up with substantially different data than you would if you had all the context.

Let’s say you have a lot of traffic from Safari and it’s social. One person comes back every 25 hours, over and over for a month, always just one hour past the 24-hour window. How many users did you have that month? Technically 1, but your reporting says 30. And you have the worst retention rate ever, because you show 0 retention, while new users appear to be knocking it out of the park. But that’s not really what’s happening. You might think, “I can cut back on my new-user spend because new users are crushing it; the engagement team is knocking it out of the park.” But that would be a mistake.
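To make the arithmetic concrete, here is a toy simulation of that scenario, with the hypothetical numbers from above: one real visitor, 30 visits spaced 25 hours apart, and client-side state that expires after 24 hours.

```typescript
// Count how many "new users" analytics would report for a single person.
const COOKIE_LIFETIME_HOURS = 24;
const VISIT_INTERVAL_HOURS = 25;

let reportedNewUsers = 0;
let lastVisitHour: number | null = null;

for (let visit = 0; visit < 30; visit++) {
  const hour = visit * VISIT_INTERVAL_HOURS;
  const cookieStillAlive =
    lastVisitHour !== null && hour - lastVisitHour <= COOKIE_LIFETIME_HOURS;
  if (!cookieStillAlive) reportedNewUsers += 1; // counted as brand new
  lastVisitHour = hour;
}

console.log(reportedNewUsers); // 30 "new users" from one actual person
```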

So That’s A Big Problem For Data Analysis. What Other Impacts Are You Seeing On User Experience, Or In Any Other Fields?

I’m going to speak from a more general development point of view. You wouldn’t necessarily always go back to the server to manage state, because maybe the server doesn’t need to know, yet you still need some sort of long-term storage.

So a good example of this would be GDPR consent dialogues. If a visitor opts out, you don’t want to show that dialogue box on every page. So the client needs to understand that, but the server doesn’t necessarily need to care. But because of the way this is structured, and the need to preserve state for potentially longer than 24 hours, you would now have to have the client call the server, and have the server do something that the client could previously do on its own. So, in order not to be subject to edge cases around things that aren’t even within your control, like how people get to your site, you now need a different architecture for anything concerning state.
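A hedged sketch of that re-architecture, assuming a hypothetical /consent endpoint that persists the choice in a long-lived, server-set cookie instead of relying on script-writable storage:

```typescript
// The client records the opt-out through the server, which answers from its
// own cookie on later visits, so nothing depends on client storage surviving.
async function saveConsent(optedOut: boolean): Promise<void> {
  await fetch("/consent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ optedOut }),
    credentials: "same-origin", // send and receive the consent cookie
  });
}

async function shouldShowConsentDialog(): Promise<boolean> {
  const res = await fetch("/consent", { credentials: "same-origin" });
  const { recorded } = await res.json(); // hypothetical response shape
  return !recorded;
}
```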

How Should We Be Adapting?

You have to look at how you set state, essentially, and go through all the code and ask: do I have to do this differently? Do I even need to do it at all?

From that you figure out, “Okay, I need to set state under these circumstances, so I’m going to have to re-architect that.” And that is commonly going to require additional development investment.

I also think what’s going to happen is that marketers who are used to doing all this tag installation are going to have to understand how a tag works, because that’s now as important as, if not more important than, what the tag actually does. So I think that could end up moving tagging toward a more developer-focused workflow. Companies are going to have to look at all of this and figure out, “How do I support this?”

Now, maybe you don’t have a lot of Safari traffic and you can more or less skip it. But maybe you’re 50, 60, 70, 80 percent Safari traffic because you get all of your traffic from iPhone users, and then you’re going to have to stop feature development to fix this if you want to maintain collection accuracy.

It Sounds Like Marketers Are Going To Have To Get A Lot More Technical, Or Work More Closely With Developers.

I think that basically is going to be the only path forward because you can’t just add a marketing pixel to the site and expect it to work as you intend.

What Should Everybody Be Doing Differently Right Now?

Stop looking at stuff in aggregate, and start looking at it by browser. But also have context on how that browser operates, because you’re going to want to know that before you start shifting your spending. Otherwise you’re going to be making choices that seem logical but are ultimately based on a flawed understanding of what’s actually occurring.

You can find more from Cory Underwood at Linkedin.com, and you can watch the webinar, “Keeping Up With Browser Privacy Changes,” here.

To learn more about SiteSpect, visit our website.

Ruby Brown