In this bonus interview with Simon Poulton of Wpromote, we go deeper into the concepts of digital privacy, digital security, user permission, GDPR, and how this all impacted Simon's wedding day. Simon is Vice President of Digital Intelligence at Wpromote and oversees the Data Strategy & Marketing Science teams. He has worked with an array of global brands to guide their approaches to data-driven marketing, including Forever 21, NBC Peacock & Whirlpool.
If you missed it, go listen to our Core episode on Privacy where Simon is one of the guests we interview as we ponder the deeper concepts of digital privacy, and how that specifically impacts digital advertising.
In today's bonus episode, you'll hear the rest of the conversation Simon and I had that we couldn't fit into our full Core episode on privacy, as we all ponder digital privacy together.
Simon Poulton is an experienced digital marketer with a passion for utilizing data to guide decisions. As the Vice President of Digital Intelligence at Wpromote, Simon oversees the Data Strategy & Marketing Science teams & has worked with an array of global brands to guide their approaches to data-driven marketing including Forever 21, NBC Peacock & Whirlpool.
In addition to his role at Wpromote, Simon frequently speaks at various digital marketing conferences on topics related to measurement & digital privacy and has been featured by the Wall Street Journal, Reuters, AdAge, FastCompany, AdWeek, Wired, Moz, Search Engine Land & more.
(automated transcript, please excuse errors)
Speaker 1 (00:03):
How do we get this data? Why do we have this data? And what can we do with it? All of that, I think, does come back to the consumer having the right to both consent and ultimately monetize their own data, so they can have a fair value exchange here.
Speaker 2 (00:21):
Welcome to the ZATOWorks PPC Ponderings Podcast, where we discuss the philosophy of PPC and ponder everything related to digital marketing. Are you ready for a masterclass in digital security and privacy? Today's show is a bonus episode of our full interview with the Vice President of Digital Intelligence at Wpromote, Simon Poulton. In this episode, Simon gives a plethora of ponderings about digital privacy: what to be concerned about, how the internet works, and even some thoughts on the future of our digital world. If you haven't heard Simon in our fifth official PPC Ponderings Podcast episode about digital privacy, go give it a listen. Otherwise, please enjoy our full conversation with Simon.
Speaker 3 (01:06):
Let me start by asking you your name and title, and maybe talk through a little bit of what that means in terms of privacy, you know, why you speak on it a lot. Maybe take it away from there.
Speaker 1 (01:17):
Absolutely. So my name is Simon Poulton. I am the Vice President of Digital Intelligence at Wpromote. I've been with Wpromote now for about seven years, and I've always been really engaged by the data space. I originally started working in various different areas of digital marketing, like many of your listeners: starting with a lot of SEO, moving into PPC at a time when it was still novel and very forward-thinking to say, "I think there's money in paid social," and really jumping on those bandwagons. But all through that period, I was becoming more and more aware of the challenges associated with measurement for brands, and what that really meant in terms of how they made spending or investment decisions. And I think we saw this transition away from the incredibly simple world of last click, when it was just paid search.
Speaker 1 (02:10):
It got a lot more complicated when the world of programmatic and paid social and all these other new mediums came to bear. As digital marketers, we had to stop thinking about clicks and start thinking about the overall influence we were creating, but that's much harder to measure. So I created this team here at Wpromote called Digital Intelligence, and our primary focus is to think about data utilization. There are so many times where we as digital marketers share data that is interesting, but not useful. I've been on lots of calls where folks have said things like, "Hey, add-to-carts on this pair of shorts are up 5% week over week." What do I do with that information? I don't know. Is there anything useful there?
Speaker 1 (02:51):
We've also seen a lot of data hoarding happen in our space, and one of the key components that I keyed onto pretty early is: how do we get this data? Why do we have this data? And what can we do with it? As I was studying that, and really going through a lot of the hows and whys behind data utilization in the digital marketing space, we also saw a simultaneous push from Google and Facebook to really invest more in the machine learning side of things and these optimization signals. I don't know if keywords will ever go away at any point, but this idea that your levers are heavily diminished really does come back to this idea of data feedback loops, and how you're really able to engage that engine.
Speaker 1 (03:34):
So I was going through a lot of this, and I started looking at different privacy solutions out there. Of course, I think the big one that everyone's aware of is GDPR. The date it came into effect was Memorial Day weekend 2018, and the reason I remember that is because that is my wedding anniversary. That was actually the day I got married, and I had everyone and their mother reaching out to me going, "Hey, GDPR, what's up with this?" The reality is, I'm not a lawyer, so I'm not going to give legal advice, but I am someone who looks at what changes and trends we're going to see in the industry, and what impact this will have on what we're doing.
Speaker 1 (04:13):
Of course, GDPR was a really big, fundamental shift in the idea of who owns this data. Who has the permission to collect data? What rights do consumers have over how you use their data after it's been collected, or if you share it with other groups, if you augment it, if you enrich it? We obviously haven't seen anything quite like GDPR here in the United States, but we did see CCPA, and that was a really big move. So obviously I'm interested in the legislative side, but I'm primarily interested in the technical side. I do believe, by the way, there are three buckets here. We have the regulatory bucket, which is legislation. We have the technical bucket, which is policy-driven or function-driven changes from companies like Apple and Firefox. And then you also have the third bucket, which is consumer sentiment.
Speaker 1 (04:54):
I can certainly touch more on that one later, but the technical bucket, I think, is the most interesting and exciting, because it's really where we're seeing the biggest change and the biggest movement. If you believe, like I do, that we're unlikely to see any degree of federal-level legislation here in the United States in the next few years, then you have to look very critically at the changes happening in the technical space. That's where I started really reading a lot more about John Wilander. He's the architect behind Intelligent Tracking Prevention, which is really where Apple started their crusade against cross-browser, or really cross-platform, identity solutions. So ITP was a really big move, and it's been echoed by other browsers in the space: Firefox has Enhanced Tracking Prevention, ETP, and then there are browsers like Brave that take a "don't even exist on the internet" approach, where you can't even see those individuals in Google Analytics.
Speaker 1 (05:45):
Beyond that, we start getting into the wide world of Apple policies and what they mean. I think what becomes really interesting here is how they're interpreted, because they're not regulations in one sense, but they function like regulation: if Facebook is being told, "you can't use this data," they have to interpret that in some way. Is it a universal view of the world, where an opt-out on an iOS device means no desktop tracking either? There's a lot of interpretation here, and I think that's what's really interesting about where we're going right now: how do we think more critically about the ethical use of data within digital marketing initiatives? And I apologize, because that was such a long-winded, roundabout story to get to that narrative.
Speaker 4 (06:31):
No, that was fantastic, especially the three-buckets idea: regulatory, technical, and consumer sentiment. Let's pull back for a second and look real big picture, because sometimes, you know, especially us real technically-minded people (and I'm being very gracious to myself by lumping me in with you there)... All that being said, for, let's say, brand owners listening to this podcast, someone who maybe runs a brand online, right, even the word "data" thrown around means different things to different people. So when we're talking about data that is being tracked, can you give maybe a definition of what that data is, and maybe even what it's used for, in a generalized sense? Because obviously there are going to be different ways of looking at
Speaker 1 (07:27):
that. Gosh, that is a ginormous question.
Speaker 1 (07:32):
So I think we need to start by thinking about data ownership to begin with. Before we even define what types of data are included, we should talk about the ownership structure. Historically, we've predominantly talked about first- and third-party data. First-party data is data that has been explicitly shared with you, that you collected directly from a consumer in a one-on-one relationship. Now, that may not be an active process of sharing; it might not be just an email. It may also be the passive sharing of data: when I'm on somebody's website, there is a cookie that is now placed within my browser. I didn't choose for that cookie to be there, but it's just the nature of doing business. You can think of it like when you walk into a store and there are security cameras. The reality is, when you walk into the store, you have said, "I will abide by your terms of service," which is: you are allowed to track me while I'm in the store.
Speaker 1 (08:21):
In some ways there's a loss prevention side of things there, but it's also a shopper experience optimization side of things. So that's really the way we should think about first party. Then third party is data that is owned by another provider, and this is where things start to get complicated, because there are all kinds of data sharing agreements that exist between different companies. When a user is on Facebook and they're interacting there, that's Facebook's first-party data on that user. Facebook will then let you use that data for targeting within their ecosystem. That doesn't necessarily mean you've been exposed to, or are utilizing, third-party data, because you don't actually have access to what that data is. You just say, "I would like to target people who are interested in this," and Facebook looks and sees who they can target.
Speaker 1 (09:02):
We then also have second-party data, which is a little nuanced, but essentially this exists in data co-ops: maybe brands that are owned by the same parent, like a PE group, or some kind of cluster of brands that are all very close together and have some kind of data sharing agreements. Certainly large companies like Comcast and Verizon have multiple subsidiaries where they have some degree of data sharing. That's all second party, because it's not truly third and it's not really first. And then Forrester tried to create this term called zero-party data, which is essentially just the augmentation of first party. It's the idea that consumers are giving you additional insights into their life, like a quiz where you say, "yes, I like going skiing in the winter."
Speaker 1 (09:44):
Those are the kinds of things you couldn't necessarily infer about someone, but I think of that as first party still, because that's all about ownership structures. Then when we get into types of data, there are really two classifications: essentially PII and non-PII. You can split some hairs on what the differences are, but the definition of PII has been growing. Historically, we would've looked at this and said, what is PII? Things that are directly representative of me. My name is Simon; that is identifiable information that someone is able to have on me. But what about IP address? Is an IP address considered PII? My wife and I have the same IP address when we're at home. In fact, I am in an office building right now where I'm sharing an IP with about a hundred other people.
Speaker 1 (10:28):
In my view, that's not PII, but it is classed that way by GDPR. Oh, sorry: by CCPA. On top of that, you can have different attributes stored with an individual: their transaction history, behavioral history, what they click on on a website, what purchases they've made historically, where they live, and any interests that can be augmented there. That's really the scope of data we're talking about: PII and behavioral intelligence, combined with any kind of augmentation. Now, this is where it starts to get really tricky, because a lot of brands have historically utilized some degree of third-party augmentation provider. What I mean by that is, if you come to my website and you give me your email, we're now in an agreement that I have access to that email and I can use it to share information with you.
Speaker 1 (11:50):
That's a very interesting augmentation of third-party data coming into a first-party realm. Now, a lot of that is getting stamped out, because it's really quite unethical, and this is where we're seeing the formation of clean rooms moving forward. Clean rooms are a really interesting space, like Google's Ads Data Hub or Facebook's Advanced Analytics. The big joke in MarTech land right now is that as soon as Netflix announced they were going to have ads on their platform, everyone said, "I'm so hyped for the Netflix clean room." Every platform is going to come out with some degree of clean room. The idea is that you can have a privacy-resilient environment where PII is not exposed, but you can still do that degree of matching. I can see, at an aggregated level, that I have a thousand people in this group I'm targeting who all like Star Wars. I'm not going to be singling you out as Kirk, and I'm also not going to have that information directly on you. I'll just have an aggregated view, and there will be privacy checks in place to mitigate any degree of individual identification. Sorry for the tangent.
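To make the clean-room idea concrete, here is a minimal sketch of the aggregation-with-thresholds pattern Simon describes. This is hypothetical illustration, not any platform's actual API: individual records go in, but only aggregate counts above a minimum audience size ever come out, and the `MIN_AUDIENCE` value is an assumption for the example (real clean rooms set their own privacy thresholds).

```python
from collections import Counter

# Hypothetical sketch of the clean-room pattern: queries return only
# aggregate counts, and segments smaller than a minimum audience size
# are suppressed so no individual can be singled out.
MIN_AUDIENCE = 50  # assumed threshold; real platforms choose their own

def aggregate_interest_counts(records, min_audience=MIN_AUDIENCE):
    """Map interest -> count, dropping segments below the privacy threshold."""
    counts = Counter(
        interest for record in records for interest in record["interests"]
    )
    return {k: v for k, v in counts.items() if v >= min_audience}
```

An advertiser querying this would learn "a thousand people in this audience like Star Wars" without ever seeing who those people are, while a segment of three would simply not be reported at all.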
Speaker 4 (12:50):
No, it's good, because you actually hit on some other things I wanted to get into as well: ownership, and that's a nice segue into consumer sentiment and all of this. I'll just lay out a little bit of my own bias, and then I'd love for you to correct me or hop in, however you think about this stuff. Consumer sentiment right now seems to be, well, maybe you have some people who don't care, but overall it seems consumers are kind of waking up to the fact of, "oh gosh, how is my data being used? I don't like that. I don't want, you know, Walmart to have my data." So there's kind of this uproar about it, driven in part, and certainly helped, by the whole Apple ITP push and all that.
Speaker 4 (13:45):
And yet I think consumers aren't fully aware of a few things. For instance, this whole idea of data being used in advertising is not remotely new as an idea; it's been used for a long time. And then, as you said, there's this whole differentiation between PII and non-PII. I just have to wonder if consumers are looking at the internet and not fully understanding that, in order for them to use this ecosystem, something has to pay for the content, the products, all of that, and targeted advertising is a big part of it. So if we could figure out what consumers are actually concerned about, I wonder if it's more on the PII side, but I feel like the non-PII side has started to be lumped into that too: the demographic and behavioral traits, things like that. At some level, as long as those things aren't problematic... you get into things like gender and race and religion, that sort of thing. That's where things get problematic, where someone might use that information against someone, to harm someone.
Speaker 1 (15:02):
Right, but what's your definition of problematic? Because there are things in my life that I consider problematic that you don't. Exactly: there are these basic ones we can all agree about, but I'm a big and tall guy, and I like to look at big and tall shops. I don't necessarily want that data used to openly target me with the message that I should be wearing much bigger shirts than the average male consumer. But sorry, to go back: you were still asking a question, but I've got some thoughts there.
Speaker 4 (15:30):
Yeah, totally. And at some point I'll land the plane on that question too, but some of that was just me wrestling with the idea. In some ways, what I'm hearing is consumers basically saying, "we don't want you to ever have any information on us at all." And on the other side, there's some level of, man, for someone to know that I like Star Wars and Legos, and therefore to serve me an ad about Star Wars and Legos, that's actually a positive thing for me personally, based upon those aspects of me, right? So is it the end of the world that that is somehow shared, somehow anonymized, but still usable by advertisers? I'm just not sure that's actually a negative thing, but I feel like consumer sentiment right now sees it as a negative thing. And I'm wondering: do we need to figure out how to separate those two things? Are those two things not able to be separated? So I don't know, that's less of a question, more of just some thoughts. Please take that and do something with it. <laugh>
Speaker 1 (16:32):
Yeah, I think you're heading down a lot of good thoughts there. The first thing we need to do, though, is frame up the question of what is privacy, and that probably sounds like an overly redundant question. It's very important that we think about privacy and security as being two independent things. I believe Apple had an ad on TV talking about the privacy that exists on the iPhone, and one of the components of that ad was somebody in a park going, "my credit card number is 1, 2, 3, 4, 5, 6," et cetera. That, to me, is not a privacy issue. That is a security issue. There is no company I know of that would ever be tracking credit card numbers, short of credit card processors, because that is their function, and they all have secure functions in play there.
Speaker 1 (17:18):
And I put this out there because I think the narrative has been very much driven by Apple. Apple is doing it by design: they are creating a message that privacy is incredibly scary, and that you should be afraid of any piece of data you have out there in the world. There are also these incredibly scary stories you see in the media. Cambridge Analytica is one, and when you really dig in there, you go: that actually wasn't a privacy issue so much as it was a security issue, once again. So we agree we should have secure means of handling these things. It's the same any time I hear about a data leak: I look at that again and go, that's not a privacy issue, it's a security issue.
Speaker 1 (17:59):
Then you also hear these incredibly sensational stories, and for what it's worth, I do not believe this one ever happened. Everyone in the media reported that there was a young lady who was pregnant and unaware of this pregnancy. You may be familiar with the Target example, right? Everyone knows that story. I traced it back to its origins: it was an anecdote from somebody in an interview with the Wall Street Journal years ago that said, "imagine if..." And that story caught on, because it's so scary. If a company can use their data to know you better than you know yourself, to predict pregnancy, that's incredibly scary. But the reality is, that's just not true. One, there was never any actual data backing up that this happened. And two, we in the data space try to predict things all the time.
Speaker 1 (18:50):
And we're fraught with error. One of the worst things you could possibly do is predict that somebody is pregnant who is not, and send information about that. There are a million reasons: someone could have had a miscarriage, it could be a baby under bad circumstances, you could put somebody in danger because whoever is in their house doesn't want them pregnant. There are all these reasons why you wouldn't do that, and I think that's where a lot of these stories originate: from a place of fear. When we are really thinking about privacy, I think there is a very important piece of this that is focused on data sharing. Security is incredibly important, for what it's worth; I am a security advocate all day long. But it's the data sharing piece that's the challenge.
Speaker 1 (19:30):
This is where you see, I think, two major groups out there. One I will call the data-zero group: the most hardcore privacy warriors out there in the space, and quite honestly some of the loudest voices. They're the folks who are really heavily involved with things like CCPA and Apple's WebKit and so forth. These are the people framing this up as, "no, you shouldn't be able to track any data on me at all." That's a challenge, because part of why we track data is to create better user experiences. But I agree we have to step back from where we are today, and there has to be a middle ground. We have to be very responsible and ethical about the scope of data that we are tracking.
Speaker 1 (20:10):
We've got to have a reason and a purpose for it. You also need to have a strong policy in terms of retention structure: how long do you plan to hold onto this data for? What do you plan to use it for? And the most important piece: are you going to be sharing it with third parties? When folks start to get really concerned, as you touched on before, think about the value exchange for digital utility. I will absolutely share my address with Google Maps; it's incredibly valuable in my life, and there's a lot of value there. Does that also work for Facebook? In many ways, it does. I will absolutely share my data with Facebook because of the utility I receive from their platform. And I think what this really comes back to is that consumers have never really had a way to monetize their own data.
Speaker 1 (20:51):
One of the more interesting things from Google that we just saw come out at GML is the ads, forgive me, you may remember the name, the ads selection center, in that a user can go in and say, "yes, I'm in market for this, or I'm in market for that." I think what's particularly interesting is the way we've seen data portability standards under GDPR open up a whole door for me as a consumer to take my data and say, "okay, this is the data that Domino's Pizza has on me. I take it over to Pizza Hut, and they'll give me $10 off." That's a really good value exchange, because now they get my history on the pizzas I like to order and the frequency I order at, and I also get $10 off my pizza.
Speaker 1 (21:30):
So that's a really good deal. What we really need to look at here is: is the value exchange for a consumer appropriate for the amount of data they're willing to share, and does the consumer have confidence in the way that data will be used and shared? A big problem with this is that Facebook and Google seem to be getting the brunt of the attacks, when really it's the bad apples. It's the third-party data players. When John Oliver talked about this on Last Week Tonight, he was talking about some really, really bad stuff, and that really ugly stuff ruins it for the rest of us; it has created a problem in the industry. When you do not have any regulation, when you do not have any guidelines around these things, people will go rogue and take things into their own hands. That is a very real problem, and we need to be very cognizant about how we address it. I think we do that through being forthright, open, and transparent with consumers, and, regardless of what the regulation says, always looking for consent. Consent is the number one thing we need to be focused on.
Speaker 4 (22:31):
Let me ask you: you noted two major groups, and I'd like to get those listed out. You mentioned data zero; I don't know if you noted the second one you were going to talk about. What are the two major groups in terms of privacy?
Speaker 1 (22:43):
There are the data-zero groups, the ITP advocates, the ones who are saying, "there will be no degree of tracking on me at all." And then there's the more moderate group, which I would fall into, where we are very aware of the ethical concerns and considerations associated with data utilization. We are very mindful of what this means, and we are advocates for putting in place some degree of guidelines around data utilization. But the problem right now is that the data-zero folks are the ones winning the day, right? When you look at things like Apple's App Tracking Transparency, they forced a prompt on every app developer out there to ask, "will you consent to being tracked?" That's a really harsh message to give folks. It's reality, but when you can inform a user through the value exchange, and at least give them that line of sight as to why, like, you don't want to see the same 50 ads for car insurance when you don't own a car, right?
Speaker 1 (23:42):
These are quality-of-life measures, and also the reason why a lot of things on the internet are free, and we've now grown up in the internet-free generation. It would be a very scary scenario if you had to pay to use all these functions on the internet. Some folks would love that, but that's a really elitist view of the world. The internet is for everyone. If you are barely getting by on minimum wage, you still have the right to use Google Maps; you still have the right to do those things. The fact that they're able to provide that through advertising support is, I think, one of the most beautiful things about the internet: it has democratized access to information and function and utility. As soon as you start to say advertising is not going to be as functional in these environments, you put a cost in front of it. That's maybe fine for me and you, but it's not fine for many, many millions of users out there in the world.
Speaker 4 (24:36):
Yeah, it definitely is a talking point to be able to sound like you're protecting everyone's privacy: "I'm data zero, I sound amazing." I don't know if you're familiar with the phrase "bumper sticker ethics," but it's the idea that the reason bumper stickers are actually not that helpful is that they compress something that's usually more complex than what you can put in three words. Not always, but a lot of the time something is more complex, and it's difficult to communicate that to someone. If you give them the option of, "hey, do you want bad people, like John Oliver talked about, to track you around, or do you want to be anonymous online?", well, between those two options, duh, who wants to be tracked? But it could be phrased, exactly like you're noting, in a way that's much more complex: "hey, do you realize the way the ecosystem that is the internet works is in this form of value exchanges?"
Speaker 4 (25:43):
And so part of what you're doing is supporting those small businesses. Do you realize that removing the data from them actually doesn't help them, and blah, blah, blah, and that the data question is actually more about privacy than security? Which, by the way, is the first time I've ever heard that distinction between privacy and security, and maybe that's more a matter of me not having dug into it.
Speaker 1 (26:04):
Apple is amazing at marketing and has completely changed the narrative. That's the thing: Apple is so good at marketing, and I think a large part of the problem is that they have single-handedly almost defined what the narrative should be. And we do lump them together as a society in the way we talk about these things. Privacy is absolutely not about security, but security, by far and away, is my number one concern in the digital ecosystem.
Speaker 4 (26:27):
Yeah. And of course I'm familiar with the two things; I just don't know if I've ever considered them in this context.
Speaker 4 (26:35):
In that way, in a data context: there's privacy, there's security, and those two are different entities. Exactly what you said: they're just conflated all the time.
Interesting. Okay, no, this is fantastic — this is kind of what I was hoping for. Okay, so let me throw this out at you. Say a customer email list is hosted on Mailchimp, right, and it's accessed by HubSpot.
Downloaded by a marketing agency, uploaded to Google Ads as a customer match list.
So all of those have a copy on their servers. Who owns that data? Do all of them? Or is it just the user — the user's email address? What is happening to that data?
Speaker 1 (27:31):
Yeah. For what it's worth, you might want to consider having a privacy lawyer as your third guest. And I put this out there because that is a very complicated question, and it's one that I don't think has a very clean answer. Yeah. The reality, though, is that with any platform you're using, there is a line. HubSpot, or anyone who works at HubSpot, being able to go look in your account and find Kirk's email address in there — that's a line they cannot cross. With the agency that's involved, there would be some direct PII exposure, but you could also make the case that they're acting on behalf of the client — who does have the first-party relationship — as a service provider taking the client's data into the platform.
Speaker 1 (28:16):
I think, by the way, that the scenario you just isolated is an incredibly risky practice. If folks are listening to this and they take one thing away from today, it's: do not store plain-text email addresses anywhere. <laugh> It's a liability — you had that direct exposure, so if anything came from it later on, like a fine, that's a problem. But it's also a security problem in that your email may get hacked, and you are now liable for whatever happens to those email addresses. So that's where we need to think very critically about hashing and data onboarding providers. This is why I think very highly of the customer data platform space. For what it's worth, if you ask different people in the MarTech world — or even the CDPs themselves — what a CDP is, you're likely to get a hundred different answers, and that's perfectly normal.
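Simon's "don't store plain-text email addresses" advice can be made concrete. Match-list uploads to ad platforms generally expect a SHA-256 hash of a normalized (trimmed, lowercased) address rather than the raw text — a minimal sketch, with hypothetical helper names; real platforms document additional normalization rules:

```python
import hashlib

def normalize_email(email):
    # Trim whitespace and lowercase: the common baseline normalization
    # before hashing. Individual platforms document further rules.
    return email.strip().lower()

def email_match_key(email):
    # SHA-256 hex digest of the normalized address. This is what gets
    # uploaded as a match key, so the raw PII never leaves your systems.
    return hashlib.sha256(normalize_email(email).encode("utf-8")).hexdigest()

key = email_match_key("  Kirk@Example.com ")
```

Because the same person always yields the same key, a platform can match the hash against hashes of its own users' emails without either side ever exchanging plain text.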
Speaker 1 (29:04):
That happens when we see a novel technology come into play — and for what it's worth, it's not entirely new. There's obviously the CRM, which is the OG; there are DMPs; there are marketing automation platforms that have tried to do a lot of this over time. But what we're really thinking about with the modern CDP is this idea that you can have a central ingestion location for all of this data. You can have privacy protections in play that prevent anyone from seeing the actual data behind it — so you'd have pseudonymous IDs for anyone who's querying the data, for example. And then you have hashing and event-forwarding functions within the CDP that allow you to bring that data into an advertising environment in a very private and secure way. That means no one at any point along the way was exposed to that PII.
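The "pseudonymous IDs for anyone querying the data" idea can be illustrated with a keyed hash. This is a hypothetical sketch, not any particular CDP's implementation: because the secret stays inside the CDP, an analyst — or an attacker armed with a list of guessed emails — can't recompute the mapping the way they could with an unkeyed hash.

```python
import hashlib
import hmac

# Hypothetical secret held only by the CDP; in practice it would live
# in a key-management system and be rotated.
CDP_SECRET = b"cdp-internal-key"

def pseudonymous_id(identifier, secret=CDP_SECRET):
    # Keyed hash (HMAC-SHA-256): stable per user, so analysts can count
    # and join on it, but irreversible without the secret.
    return hmac.new(secret, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

pid = pseudonymous_id("kirk@example.com")
```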
Speaker 1 (29:47):
And it also means there's no way anyone can re-identify it during that process. So I think what's really interesting about this: one of the things in the Privacy Sandbox proposal Google has is mitigating in-browser identity connections, and that could have a very profound effect on the industry. That's essentially what LiveRamp does, right? They're a data onboarding partner. There's lots of need for data onboarding out there, whether it's for targeting or, in many cases, suppression — there are a lot of times when you don't want to be advertising to people because they've tried to opt out of something <laugh> and they're asking not to be targeted. Those are all very big considerations, and I think we need to think very critically about how we onboard and how we take that data into different places.
Speaker 4 (30:30):
That was — yes, I appreciated the way you started it, which was basically: listen, I'm not a lawyer, go find one. Yeah. I mean, that is probably a key part that those listening will need to be aware of — more and more, owning a responsibility that I think is still pretty new in our industry. And by "our industry" I'm speaking of the digital agency world, the digital advertising world, right.
Speaker 1 (30:57):
Even if you are working somewhere in-house, there's still a concern with regard to PII exposure internally. That is a very real problem. Even if someone's acting as an agent of that company, I still don't want some random marketing manager being aware of my name and email and phone number and all these things that could have potentially bigger effects downstream. I'm okay, though, when my identity is synced between different environments, because I know that's not creating a new identity. So, you know, when you're uploading data to Facebook, it gets purged right away. They're just doing an identity sync, because they've already got your other data — they're just saying, yep, we've matched that individual. They're not creating a new identity from that, and if they can't match it, they delete it. That is a really important component of this as well.
Speaker 1 (31:40):
And I think it speaks very much to this idea that if you've got data just sitting around — legacy data you're not going to use — you should delete it, because it's just a liability sitting right there in the middle of your marketing team. And for what it's worth, I think we're going to see a lot more companies need to bring in a marketing compliance officer. As regulation ramps up — especially CPRA, as that comes in — there are going to be all types of scenarios where a user may request deletion of their data, and that needs to be handled by the marketing team. And moreover, you need to understand: well, where have I sent this data? Where have these identities gone? That's a very complicated process. It's easy in the early days — hey, we got one opt-out request today, we can process this. But what about when we're years into this, and there are automated tools out there where users can just put in their email to auto-request removal from everything? Then suddenly you have thousands of requests, and managing that becomes incredibly complicated.
Speaker 4 (32:37):
Hmm. Okay, so let's switch gears a little bit: Google Analytics. Google Analytics is moving away from Universal Analytics — we've been given the deadline, GA4 is coming. I've heard conflicting things about Google making this change for reasons like privacy. Mm-hmm <affirmative>. Can you walk me through event-based tracking as opposed to session-based tracking? Is that change because of privacy reasons, and if so, how?
Speaker 1 (33:17):
Yeah, so there is a component of privacy in there. What it really comes back to, though, is the scope of the data deletion capability. In the current GA3, or Universal, ecosystem, when you have to go and try to delete data, you delete all of the data for that entire day, for everyone, across the board. That is obviously a challenge for brands to manage, and it's also an unnecessary burden and hurdle where folks go, oh, let's not worry about it — and that's quite dangerous, right? With an event-based approach, we have essentially individual, isolated, siloed data sets that can be pinpoint-removed. So instead of a very broad move of "I'm going to have to delete everything in here, so I'm not going to do it, because there's too much and there are too many reasons I need this data" —
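The granularity difference Simon describes can be shown with a toy event log — an illustration of the concept, not how either analytics product is actually implemented. Day-scoped deletion removes everyone's data for a day; event-scoped deletion removes only the requesting user's records:

```python
# Toy event log: each row is an isolated, user-keyed event record.
events = [
    {"client_id": "A", "day": "2022-06-01", "name": "page_view"},
    {"client_id": "B", "day": "2022-06-01", "name": "purchase"},
    {"client_id": "A", "day": "2022-06-02", "name": "page_view"},
]

def delete_day(log, day):
    # Day-scoped deletion: user B's purchase is lost
    # even though only user A asked to be deleted.
    return [e for e in log if e["day"] != day]

def delete_user(log, client_id):
    # Event-scoped deletion: only the requesting user's records go,
    # so the rest of the data set survives intact.
    return [e for e in log if e["client_id"] != client_id]
```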
Speaker 1 (34:05):
— it now gives you a lot more pinpoint accuracy. So I think that's one component. The other is that IP anonymization is now the default, and that's an important piece of this in terms of how we look at individual identity resolution. But the reality of all this is that Google Signals still exists inside GA4. So Google still has a lot of line of sight into who these individuals are; they're linking them to their own identities. You could even make the case that Google has so many first-party properties out there, and they're collecting so much data there, that when they bring it in they have those identity links. That's no different from how we've been operating in a world with third-party cookies — it's just one that more critically advantages Google and their ad products. As a result, the reason they moved from GA3 to GA4, predominantly — and quite honestly, why it was a bit of a rush job — is that the amount of storage and processing costs that Google currently assumes with GA3 is ridiculous.
Speaker 1 (35:01):
People leave Google Analytics accounts on websites forever. They collect so much data, and historically there was no purging of it. If you'd had an account running Universal Analytics for eight years, you were still collecting data but never using it. This is Google's way of saying, okay, we're going to cut down our server costs pretty significantly, and we're going to move to this new structure. It is in the name of privacy, because there's also a setting in there that lets you choose how long the data retention timeline will be. I think the default is three months, and it's 13 months if you change it, and I think for some GA4 users it's longer. But really, they're also just trying to make the case that folks should be moving to BigQuery anyway.
Speaker 1 (35:43):
And that is a really, really big learning curve in our space — going beyond the analytics platform. But it's something I'm very excited about, largely because I know it gives us a lot more functionality. And for what it's worth, I've always been a firm believer that analytics platforms create bias. What I mean by that is they create bias in the way you look at data: they serve you a report and say, this is how this report works, this is how these dimensions and metrics should work together. So you're always forced to look at things in a certain way, and that creates views like, oh, is bounce rate important? And the reality is no, bounce rate is not important — engagement is important — but the way Google Analytics framed up its reports, it put bounce rate front and center on a lot of them.
Speaker 1 (36:26):
So a much better, healthier way to go about this is to have a menu of dimensions and metrics that you're able to query in a BigQuery environment. I do imagine there are going to be companies in future years that essentially just create a BigQuery instance for you where you won't need to know SQL — a codeless environment where you can just select from a menu: here are my metrics, here are my dimensions, kind of like a custom report in Google Analytics. But the whole idea there is that they've got unsampled BigQuery data that you can utilize. And then you can also potentially join that with data in your own ecosystem, because there are additional identifiers — you can pass your own user identifier for those individuals as well.
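The "menu of dimensions and metrics" layer Simon imagines might look like a small function that assembles SQL against the GA4 BigQuery export. The `events_*` wildcard table name follows the export's real naming convention; the project and dataset here are placeholders, and the builder itself is a hypothetical sketch:

```python
def build_report_query(table, dimensions, metric="COUNT(*)"):
    # Assemble a simple aggregate query from a picked menu of dimensions,
    # the way a codeless layer on top of BigQuery might.
    dims = ", ".join(dimensions)
    return (
        f"SELECT {dims}, {metric} AS events\n"
        f"FROM `{table}`\n"
        f"GROUP BY {dims}\n"
        f"ORDER BY events DESC"
    )

sql = build_report_query(
    "my-project.analytics_123456.events_*",  # placeholder project/dataset
    ["event_name", "device.category"],
)
```

A user picking from the menu never writes SQL; the layer generates it, and the result set can then be joined to first-party data on the user identifier the site passes in.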
Speaker 4 (37:03):
Hmm, interesting. Yeah. So them announcing, oh, by the way, with Universal Analytics we're actually going to be deleting all those reports, all of that data — that's a feature, not a bug for them.
They're getting all of that server space back. <laugh>
Speaker 1 (37:20):
I know I'm in the minority on this, but there are so many folks going, but what are we going to do? How are we going to have this data from so long ago? And the first thing that comes to my head is: how often are you going back beyond 24 months? I'm never going back beyond 24 months. The only scenarios where I do involve things like predictive lifetime value modeling, and those are all offline by default. When it comes to the web, it's a completely different ecosystem now. And then it's also just — I don't know, it's a hoarder's mentality. It's: I have this, I'm not using it, but I know I'm going to use it one day, I just don't know when.
Speaker 4 (38:00):
Yeah, that's fair. I mean, I like seeing how much my blog page visits have increased, but whether that actually tells me something actionable, as opposed to just making myself feel good...
Speaker 1 (38:12):
It's interesting, but it's not useful, right? Like, year over year — well, sure, there is value in that.
But if you look at it and say, oh, look how much we've grown in four years — and you're like, yeah, well, four years ago you started the website.
There are only so many ways it's interesting versus useful.
Speaker 4 (38:35):
Interesting. Okay. So Google Marketing Live just happened, and they announced on-device conversion measurement as part of their privacy initiatives, right? Can you maybe walk us through why the on-device aspect of that is important for Google — why they're changing to that?
Speaker 1 (38:57):
Yeah. I think on-device learning, and on-device identification of individuals, is a way you can essentially mitigate the transfer of data during these processes. So I think that's what a lot of this comes down to: how much data is actually being transferred back to Google versus how much is being stored on an individual device. And then once a conversion action occurs, or an interest has been identified for an individual, that becomes available. It's about the rock-solid lines of privacy they would want to maintain there. Quite honestly, it also advantages Google once again, because it's something they can use in the Google networks that other platforms may not have access to. And I think — we haven't talked about it today, but there's a really interesting other side of this too.
Speaker 1 (39:43):
Because we talk a lot about Google, we talk about walled gardens and Facebook and TikTok — all these behemoths that store all this data inside. But what about the broader ecosystem? What about programmatic — The Trade Desk, even Walmart's ad graph, how Shopify has its own ad audience network? Those are all incredibly interesting areas. And there's this whole construct called Unified ID 2.0, which is essentially a PII-based identifier that's going to be used by a lot of these companies. So there are two very divergent streams, and while it's all in the name of privacy, we've got to be very careful to understand it in a more nuanced capacity: there's also a competitive piece to this, and a market-dominance piece. And that's a challenging place to be, because you want a more private, safer web experience, but you also don't want to give this massive monopoly more power.
Speaker 1 (40:36):
And by the way, I think that's what's really interesting in Europe when Google talks about getting rid of third-party cookies. That's really good from a privacy point of view — from a cross-domain identification piece — but it also really hurts competition, and you really advantage Google in that scenario. So on one hand you've got the privacy folks going, get rid of these things, and on the other you've got the commerce commission saying, no, you can't do that, because that's a monopoly. So what do you do if you're Google in that scenario?
Speaker 4 (41:06):
Yeah. I wrote — I'm sure there were some issues with it, but I wrote a post, maybe a couple of years ago now, that was basically: when platforms control the auctions — I forget exactly what the title was — the auctions, the bidding, the targeting, and now the cookies, right? Mm-hmm <affirmative>. These platforms are controlling everything, and to your point, all of that, now in the name of privacy, is completely obfuscated from any external viewpoint. You can't even identify what's happening in there, which means they literally hold all the cards — which we'd probably be foolish not to have some concern over.
Speaker 1 (41:50):
Yeah, it's very complicated, though, because if you fully open everything up in the name of transparency and competition, you have created a huge privacy problem. Exactly. And I don't think we're going back to that place. And I don't have a good answer as to what the future state could and should look like here. I think it comes down more to investment tolerance levels and the extent to which individuals or brands are still willing to invest in these places. The bill that's going through the House right now is really interesting — although very clearly just an attack on Google and Facebook — where they're talking about essentially needing to strip out different functions of the ad-buying model within those companies. And I look at this whole thing and go: the problem is all the cloud computing they also own. That's the problem with Amazon.
Speaker 1 (42:37):
That's why they can be so monopolistic in all these other areas — because they own AWS, and that just fuels everything. And maybe it's only 8% of Google's revenue now, but that is a massive future source of revenue for Google that they're going to use to dominate not just all those pieces we're talking about, but also data processing, and even the core stability of the web. When they own everything — and this is true, by the way, of a lot of other companies too, like Verizon and T-Mobile — you look at all this and go: you own all the web traffic streams. That's an incredibly powerful place to be sitting, because you can advantage different platforms, or different ad platforms. There are all these pieces in play that can very easily be manipulated by bad actors.
Speaker 4 (43:17):
I just watched a documentary on Wall Street — Gaming Wall Street. Yeah, it was interesting. I mean, every documentary — like that one on Netflix that came out about privacy obviously had certain biases. What was it called? It was about the Cambridge Analytica thing. It was the big one; everyone talked about it.
Speaker 1 (43:39):
Honestly, at this point I can't remember. I've seen so many things.
Speaker 4 (43:44):
I'm drawing a blank too.
Speaker 1 (43:45):
Yeah. But I've come to the conclusion that if Cambridge Analytica had been about dog food, nobody would have cared — the problem is it became politicized. Yeah. But imagine a world where political advertising is the only thing you're not allowed to do through digital networks; that would be a very different world in terms of the limits that could be applied. Basically, right now the politicians have no real reason to want to limit any of this, because they know they need these networks for reelection. They know they need to utilize these networks to gain favor with the population. So they don't have much interest in really destabilizing these networks, even though there's a lot of rhetoric around it. That's why I don't see anything happening until we get to a point where it somehow personally affects them.
Speaker 4 (44:30):
Hmm. Yeah, that's interesting. I was watching that Gaming Wall Street documentary, and it was just interesting hearing, as an outsider, the economic side. One of the experts they interviewed — I can't remember who she worked for — was talking about naked short selling, which was kind of a new concept for me. Basically they make up these shares — these short sales — out of thin air so they can make money. It's just incredibly easy to game, because it's such a complex, hidden system that no one really knows what's going on.
And so for those who know what they're doing — the big players — it's easy to manipulate things. Again, I don't know all the biases that go into a documentary like that, but it's not difficult for me to believe. I'm sure that happens — you've seen some crazy stuff go down, with Enron and all of the above, right? And it wouldn't surprise me to see similar things in our industry: you get this amount of money, this many big players, everything at stake, and it's all very, very hidden. I think there's a lot of mess here that needs to be figured out — that probably won't be figured out well — but hopefully, at the very least, as we're all trying to get to a better place, we'll get there. <laugh>
Speaker 1 (45:58):
Yeah, I hope so. I think there are going to be fits and starts. I think we'll see additional states bringing in different degrees of regulation. I'm all for a consent-based framework for the future; I think that is by far and away the biggest takeaway here. Just from a consumer sentiment point of view: ten years ago I might not have gone out and bought an electric car. I cared about the environment, but did I care about it $10,000 more than this other option? Maybe not. I've changed, I've grown — I've realized I've got to do my part, I've got to help with this. And I think it's very similar to the way we'll see privacy over time: we'll have big swings in terms of gut reactions and decisions.
Speaker 1 (46:40):
But over time we'll get to a point where we realize that privacy is an inherent right, one rooted in the constitution, and something we should be very aware of. We need to think about how we respect individuals' privacy on the internet while also providing them with the utility and opportunity to use the internet in a free, open capacity — not one that gets locked down by paywalls, or by, quite honestly, just terrible advertising experiences, because there's so much bloat out there now that targeting is seriously diminished. So all of that, I think, does come back to the consumer having the right to both consent and ultimately monetize their own data, so they can have a fair value exchange in this play.
Speaker 4 (47:18):
Mm-hmm <affirmative>. Yeah. I think it's worth noting one of the reasons why moving away from targeted advertising, toward what you were describing, is not great for the overall industry: all it does is reward the deepest pockets, right? Mm-hmm <affirmative>. The person with the most money to throw at it wins, because targeting also allows us to get a little more granular in saying, hey, we'd only like to pay for these people, because we can identify them — as opposed to, well, I guess we're advertising to everyone in this region again, right? Then your ad placements are all limited, and you have more players competing for them. So again, that's a negative impact on the ecosystem that is the internet and advertising — one that isn't really shown in that quick little prompt from Apple.
Speaker 1 (48:15):
Right, no, absolutely. I mean, the internet is the greatest thing that's happened for democratizing access for small businesses and for allowing a lot of different folks and advertisers to actually get into the market. Versus TV, the way it was always historically done at a national level — being part of the upfronts and all these things. Unless you're a car company or an insurance company or one of these massive players, you were going on local TV, and that's not that great either, because just because you live in this area doesn't mean you want my product. If I'm selling dollhouses in a traditional New Zealand colonial style, I've got a very niche market.
I don't know who I would target otherwise — you can't market to the entirety of New Zealand. I want to be targeting a very specific group, and ultimately that is the goal of good marketing: to bring awareness to products that would make people's lives better. That is somewhat lost on many of these privacy advocates who are purely looking at this from a data-misuse perspective — their lens is very focused on their single objective without taking into account the broader ecosystem. But, you know, I'm trying to come around as much as I can to see through their lens. And at the end of the day, I always do the "my mother test." That's a little test I have in my head where I ask: would I be okay if my mother were going through this experience?
Speaker 1 (49:32):
And what I mean by that is: I know how to protect myself online. I know where I should be stopping, where I should be utilizing a VPN, where I shouldn't be sharing certain pieces of data — and that might not be true of my parents or other older folks. So I always try to put myself in their shoes and ask: is this an experience they should be going through? Is this the kind of data I would accept if my mother's data were in here? If the answer is yes, then I feel like I'm doing something ethical — I'm working within the realm of what's ethical, useful, and valid. If it's not, then we probably need to challenge some of our historical constructs around why we collect so much data, how we collect it, how long we store it, and ultimately what we use it for.
Speaker 4 (50:10):
Mm-hmm <affirmative>. Yeah, that's awesome. Thank you. I would love to keep chatting, but I'll respect your time. Where can people find you online?
Speaker 1 (50:19):
Yeah, absolutely. So folks can connect with me on Twitter — my handle is my last name, Poulton: S-P-O-U-L-T-O-N. Otherwise, check me out on LinkedIn. And if you're interested in working with me at Wpromote, we are absolutely hiring, and we would love to get some new faces through the door. So check us out at wpromote.com. But that's all for me.
Speaker 2 (50:44):
This has been a bonus episode of the PPC Ponderings Podcast. Keep checking back for more interviews and our next full episode. If you like what you hear, please consider sharing this with your network and leaving us a review on Apple Podcasts. Until next time, may the auctions be ever in your favor.