Let's throw caution to the wind and wade into the giant, legally confusing territory of digital privacy. In this episode, we talk with two privacy experts. While neither is an attorney (they both wanted us to point that out), both have extensive experience pondering the reality of digital privacy, as well as privacy's lesser known but more important sibling, digital security.
We talk data rights, privacy definitions, and why the ecosystem that is the internet has to stop being ignored in this debate as zero-data enthusiasts push for increased legislation. How did Apple's iOS14.5 impact businesses and privacy? We'll hit these questions and more in this important episode.
Episode Guests:
In this episode, we interview two people who deal with this on a daily basis:
- Bethany Singer-Baefsky, Director of Privacy at iRobot
- Simon Poulton, VP of Digital Intelligence at Wpromote
And make sure to watch for each of these interviewees' individual episodes, as these were masterclasses in privacy!
Episode Resources:
Here is a list of resources we either consulted or think would be helpful in continuing to inform your understanding of the things we discussed in this episode.
- Avast’s Shutdown of Jumpshot Will Harm the Web and the World by Rand Fishkin
- Jumpshot Profile on Crunchbase
- Avast CEO message on Jumpshot (January 2020)
- Chris Johnson's Full Bonus Interview on ZATO
- Follow Eric Seufert on Privacy Updates
- John Oliver on Data Brokers by Last Week Tonight
- What is GDPR?
- American Data Privacy and Protection Act Bill Draft from the US Senate
- What the FLoC is the Privacy Sandbox? by Kirk Williams
Episode Transcript:
Kirk Williams (00:00):
Raise your hand if you've heard of Jumpshot. No, not the basketball term for leaping upward in a vertical motion while simultaneously shooting the ball towards the net. I mean, the Jumpshot marketing analytics company. Well, former marketing analytics company, since it was shut down in January 2020, right before the world entered a global pandemic.
Kirk Williams (00:20):
Anyway, what the heck is or was Jumpshot? According to its Crunchbase profile, "Jumpshot delivers digital intelligence from within the Internet's most valuable walled gardens." Wait, what does "within the Internet's most valuable walled gardens" mean? Doesn't a walled garden, purportedly referring here to big tech such as Facebook, Amazon, and Google, mean the data is unreachable? So how was Jumpshot accessing that unreachable data? Here's how Jumpshot worked. Jumpshot's parent company, Avast, gave its users the option to opt in to sharing data with Jumpshot, which it referred to as "our subsidiary."
Kirk Williams (01:00):
No, seriously, it was all right within their opt-in request screen, as shared by Rand Fishkin. Then, using the Avast Chrome plugin, user data would be aggregated and anonymized. All that user data. By user data, we mean user click activity within the Chrome browser on any website. Ponder that for a second. And then you'll understand, probably, why the CEO of Avast eventually just pulled the plug on Jumpshot in January of 2020. He notes in his release, "I came to the conclusion that the data collection business is not in line with our privacy priorities as a company in 2020 and beyond." Well, glad they shut that down. Do we really all want someone tracking every click we make on the internet? Well, hold up again. All right. Because it's not quite that simple. And Jumpshot is, in my opinion, a great look at the complexity of the privacy debate and why I wanted to start here.
Kirk Williams (01:52):
Rand Fishkin, former founder and CEO of Moz and current founder and CEO of SparkToro, writes the following ponderings about the Jumpshot debacle. I quote, "Avast provides a free product. It then asks if you are willing to share your data in a way that will be aggregated, anonymized and sold. Many people said 'yes' to this. I cannot find a way to think about that as ethically wrong." He goes on further down in the article to say, "My greatest fear is that this weaponization of privacy rights as a contentious issue will shut down more and more aggregators and providers of information that let small, medium, and large competitors to the big tech monopolies compete against them. It will shut down the abilities of the press, of the government, of big tech critics like me to call those firms out on their misleading, incomplete, or false claims. And ultimately it will lead to more power and wealth concentrated in the hands of the few versus the democratization of data we all should support." (End quote)
Kirk Williams (02:48):
Okay, let's think about this. Is it possible that the rush to digital privacy is being confused with the necessary issue of digital security? And in my opinion, that's Rand's primary concern here, and one I think the world needs to ponder as well. Whatever your opinion of Jumpshot is, it does raise a critical question as the focus on digital privacy ramps up. What if a confusion between privacy and security is actually harming the way the free and open internet has always worked? What if we need to better understand the complexity of digital privacy before we begin to react to or regulate it? That's what this episode's all about.
Kirk Williams (03:29):
Welcome to the ZATOWorks PPC Ponderings podcast, where we discuss the philosophy of PPC and ponder everything related to digital marketing. Our hope is that through these conversations with professionals in the digital marketing space, we can gain a better understanding of what is happening in the digital landscape and better prepare all of us for the future.
Kirk Williams (03:56):
For this episode, we talked with two professionals in the world of digital privacy, but with different backgrounds.
Simon Poulton (04:01):
So my name is Simon Poulton. I am the Vice President of Digital Intelligence at Wpromote. I've been with Wpromote now for about seven years. And I've always been just really engaged by the data space.
Bethany Singer-Baefsky (04:15):
My name is Bethany Singer-Baefsky, and I'm the Director of Privacy and DPO, which stands for Data Protection Officer, at iRobot, where I've worked for about the past three years.
Kirk Williams (04:26):
As we began to dig into the world of privacy with Bethany and Simon in preparation for this episode, it quickly became clear that there are no easy answers in this topic, even in seeking a simple definition for privacy.
Bethany Singer-Baefsky (04:38):
Yeah, that's a key question. No, that's actually key to the whole thing: what are we talking about when we're talking about privacy? Well, privacy itself is a vast thing that encompasses personal privacy, communications privacy, information privacy, all of these different functions. The concept of privacy goes back to the invention of doors. It's a fundamentally human concept. But when we're talking about privacy laws today, we are primarily talking about information privacy. And what information privacy means is an individual's right to have knowledge of and to understand exactly what it is the company is doing with the personal information that they provide to that company or that the company has obtained about them. It's their right to exercise some measure of control over that information. So it's this idea that a company has certain responsibilities and individuals have certain rights.
Simon Poulton (05:40):
I think the first thing, though, that we need to do is frame up the question of what is privacy. And that probably sounds like an overly redundant question, but it's very important that we think about privacy and security as being two independent things.
Kirk Williams (05:54):
Hold on. What does that mean? What's the difference between privacy and security?
Simon Poulton (05:57):
So fairly recently, over the past year, I believe Apple had an ad on TV that they were running, talking about the privacy that exists on the iPhone. And one of the components of that ad was somebody in a park going, "My credit card number is: 1, 2, 3, 4, 5, 6," et cetera. That to me is not a privacy issue. That is a security issue. There is no company that I know of that would ever be tracking credit card numbers, short of credit card processors, because that is their function. And they all have secure functions in play there. And I put this out there because I think the narrative has been very much driven by Apple. I think Apple is ... they are doing it by design. They are creating a message that privacy is incredibly scary and you should be afraid of any piece of data that you have out there in the world.
Kirk Williams (06:48):
Privacy is all about managing user consent, whereas security is all about protecting against hacking or theft. This is an important distinction because for many people, the idea of their data being used by third-party advertisers has this abhorrent feel to it. How dare those capitalist pigs use my data for their own personal gain! Here I am navigating this free web and they're lurking, waiting to grab my data and use it wherever they can.
Simon Poulton (07:17):
Facebook and Google seem to be getting the brunt of the attacks in this, when really, it's the bad apples. It's the third-party data sellers. It's the folks that, when John Oliver was talking about this fairly recently on Last Week Tonight, he was talking about some really, really bad stuff, and that's really ugly. But that ruins it for the rest of us. That has created a problem in the industry. And when you do not have any regulation, when you do not have any guidelines around these things, people will go rogue and they will take things into their own hands. And that is a very real problem that we need to be very cognizant about how we address. And I think we do that through being forthright, open, and transparent with consumers. And regardless of what the regulation says, always looking for consent. Consent is the number one thing we need to be focused on.
Kirk Williams (08:00):
Okay. So certainly, there are slimy data brokers utilizing data they have no right to, or that they have tricked users into handing over with obtuse terms of service agreements. But is access to any form of user data the demonic Vecna-like threat to personal security we've been led to believe it is? I contend that digital security and privacy are not the same thing. Digital security is about core threats to trust and personal freedom as hackers and thieves steal information that is not theirs, to be used for nefarious purposes that can really harm someone.
Kirk Williams (08:33):
Can we all agree that ... We all agree that that's wrong? So what about non-harmful, non-security-related data? What about the knowledge that you are, let's say, a 45-year-old female who likes to shop at Target and drink cortados from Blue Bottle Coffee? Is access to and usage of that information really, like, honestly, that big of a deal? This is part of the complexity of privacy, since there will be disagreement among us on what data is considered secure or sensitive to different individuals. And that's understandable and normal. And it's why user consent is so crucial to the idea of privacy. More on user consent later. And it is pretty important in this conversation.
Bethany Singer-Baefsky (09:17):
The thing that I always say when it comes to privacy is that more often than not, the answer is going to be, "It depends." It depends on what the data is. It depends on the context in which the data is collected, how the data is stored, who's using it, and for what purposes. And it also depends on what the regulators say about it.
Kirk Williams (09:37):
The topic of internet privacy has hit an inflection point as consumer interest continues to rise. But yesterday, like no, literally, June 6th, 2022, yesterday, a bipartisan federal bill draft of epic privacy proportions was released. And it has the potential to be a doozy. At the time of recording this episode, a number of states have individual privacy laws. But there hasn't yet been a federal law similar to the European Union's GDPR.
Bethany Singer-Baefsky (10:02):
The reason why this is snowballing in importance: that started really with the passing of the GDPR. The four-year anniversary was just last week. May 25th, 2018 was when GDPR went into effect.
Simon Poulton (10:16):
The date that it came in was Memorial Day weekend, 2018. And the reason I remember that is because that is my wedding anniversary. That was actually the day that I got married. And I had everyone and their mother reaching out to me going, "Hey, GDPR, what's up with this?"
Bethany Singer-Baefsky (10:32):
And that was the thing that really spurred a lot of companies into action and really put it on the radar of not just legal departments, but also executive teams, boards of directors. It really brought the eyes of the world to Europe because all of a sudden, if you didn't protect data, personal data specifically, up to a particular standard, you risked not being able to do business in the European Union. You risked fines, you risked losing customer trust.
Simon Poulton (11:02):
So of course, GDPR was a really big fundamental shift in the idea of who owns this data, who has the permission to collect data. What rights do consumers have over how you use their data after it's been collected? Or if you share it with other groups, if you augment it, if you enrich it?
Kirk Williams (11:17):
We all know regulation is coming, of course, especially as consumer sentiment has continued to wind its way towards a more private world. And that's not necessarily a bad thing.
Bethany Singer-Baefsky (11:27):
The reason for why the GDPR passed ... I mean, the reasons, plural, there's a multitude of them. There was a concern over US government surveillance. There is also the whole surveillance capitalism that is becoming more and more discussed today with, you know all about that, the ad tech side of things, right? So all of a sudden you have all of these technologies where we are getting access to the internet and everything that entails for free. But free isn't really free. What is the cost, right? And so people were suddenly realizing that, hey, when I look up a pair of shoes, and all of a sudden I'm seeing ads for the same shoe when I am on my laptop instead of my phone, and I'm on a totally different website, how did that happen, right? And so there was this growing awareness of: Wait a minute. There's a lot of data collection going on and maybe we should do something about that.
Kirk Williams (12:21):
To be clear, we're big fans of privacy at ZATO, my agency. I'm personally probably more of a privacy advocate than many of my advertiser counterparts. And this likely has to do with the six children I currently have living under my roof. The thought of private details of their lives being exposed to an unforgiving outside world has me personally willing to lose some advertising targeting as collateral damage, if you will. But is it a little more complex than some forced dichotomy of either advertiser targeting or healthy user privacy? Do we have to choose between two extremes?
Simon Poulton (12:56):
There are these data zero groups, who are largely the ITP advocates, the ones who are saying, "There will be no degree of tracking on me at all." And then there's the more moderate group, which I would fall into, where we are very aware of the ethical concerns and considerations associated with data utilization. We're very mindful of what this means. And we are advocates for putting in place some degree of guidelines in terms of data utilization.
Kirk Williams (13:26):
So let's take a step back and ponder the idea of a free and open internet. This is actually a crucial part of the privacy discussion. Yet it's the part I believe most often gets missed by non-advertisers. Why did those grimy advertisers need our data anyway, right? Back off, dudes. Leave me alone.
Simon Poulton (13:43):
But the problem right now is the data zero folks are the ones who are, they're winning the day, right? And so, when you look at things like Apple's App Tracking Transparency, they forced a prompt on every app developer out there to say, will you consent to being tracked? And that's a really harsh message to give folks. It's reality. But when you can inform a user through the value exchange and at least give them that line of sight as to this is why, you don't want to see the same 50 ads for car insurance. You don't own a car, right? These are things that are quality of life measures. And also the reason why a lot of things on the internet are free. And we've now grown up in the free-internet generation. And it would be a very scary scenario if we had some degree of ... if all these functions on the internet, if you had to pay to use them.
Simon Poulton (14:32):
Some folks would love that. But that's a really elitist view of the world. The internet is for everyone. And if you're barely making it, if you're on the minimum wage, you still have the right to use Google Maps. You still have the right to do those things. And the fact that they're able to provide that through advertising support, I think, is one of the most beautiful things about the internet, and that it has democratized access to information and function and utility. As soon as you start to say advertising's not going to be as functional in these environments, then you put a cost in front of it. That's maybe fine for me and you, but that's not fine for many, many millions of users out there in the world.
Kirk Williams (15:06):
In order to have a healthy discussion about digital privacy, the ecosystem of the internet needs to be properly understood. Remember Chris Johnson, from our Supply Chain episode? In his interview, we chatted about iOS14.5 and the impact that had on users, and why that's an important part of this discussion right now.
Chris Johnson (15:24):
Yeah. I think the mega trend we're seeing in eCommerce, and just in general in advertising, is the consumer is very savvy and has choices and options. And when it comes to privacy, in terms of their demand, it's really that they want to have more control over their information and data. We know that. That's been a trend that's clear. I think with Apple and iOS14, what was happening is that with the release of the new update for iPhones, iOS14 being the update to the operating system, it was going to put in a prompt that would say, would you like these apps to track you in these ways?
Kirk Williams (15:57):
This is one of the reasons the great iOS14 debacle was such a big deal: because many advertisers decried Apple for the way that Apple went about framing the privacy consent option that they began offering users.
Chris Johnson (16:09):
Once it rolled out across iOS devices, it got pushed back a bit, but it rolled out. We were seeing really, really poor return on Facebook, Instagram, all Meta products. It was not a performance loss. It was a pullback of spend and a loss of data reporting. So iOS14 goes out and removes the reporting of these conversions. But the consumer behavior was the same. People were still buying from the ads that people were seeing. Meta just couldn't tell you that they were.
Kirk Williams (16:38):
Many of us advertisers believed Apple didn't accurately reveal the full impact the user choice would have on the entire ecosystem of a free and open internet.
Kirk Williams (16:49):
So what is this ecosystem? Well, in any commerce system, there's a creator/seller and a buyer. On the internet, the creator/seller will typically fall into three categories. First, eCommerce or product-based companies. They sell a physical product that you can purchase. An eCommerce brand selling shoes may also have a helpful running blog that they upkeep at no charge to you. This blog gives you loads of free running tips and information. It's valuable to you. They don't give you all these tips for free, though, just because they like you. It's because eventually they are really wanting to sell you their product: running shoes.
Kirk Williams (17:25):
The second category, then, is service-based companies. They offer a service that you can purchase. Your plumber, let's say, might have a website or even a blog that he upkeeps in order to rank on Google by providing you helpful information on basic do-it-yourself home plumbing solutions, knowing that when you actually need his help, you'll call him. He's offering that content for free, supported by his services, which you then purportedly purchase at some point when you've broken the sink and need him to fix your do-it-yourself solution.
Kirk Williams (17:56):
The third category is typically the most controversial one: content-based companies. These companies are a chunk of the internet and they look like many different entities. They may look like a social media platform or a local newspaper website, or a large industry-focused forum, or a search engine. How do these companies make their money? Well, to quote Mark Zuckerberg in his famous Washington, DC robot-meme hearing, "Senator, we run ads." The benefit to society of companies such as Facebook is a topic reserved for another podcast episode, but the service they offer is not free. It is supported by ads paid for by other companies, and this allows them to offer their content or services to their users at no charge to those users.
Kirk Williams (18:41):
So the way the internet has survived for decades is because brands are willing to pay for ads on many of these content websites, and thus support the ecosystem that provides a helpful service to users. Back to iOS14.5. One of the main issues advertisers had with the way Apple showed an alert, an option, to users wasn't solely that it was giving users the ability to easily consent to being tracked or to be more private. It was, and pay attention, because this is a really crucial point, it was that Apple presented this in a way that didn't fairly explain the dynamic of this ecosystem to the common user and how their choices would impact that ecosystem.
Simon Poulton (19:23):
That's the thing: Apple is so good at marketing, and that is, I think, a large part of the problem. They have single-handedly almost defined what the narrative should be. And we do lump them together as a society in the way we talk about these things. It's all privacy, not about security. But security, by far and away, is my number one concern in the digital ecosystem.
Kirk Williams (19:40):
Okay. So why is user data needed for advertising? Why not just go back to a world of billboard ads, where everyone fights for the same ad placement that shows to everyone in an area? Just some giant Thunderdome battle where only the strong survive.
Kirk Williams (19:54):
Yes, sonny. Back in my day ... Do advertisers need to battle to the death for placements on Madison Avenue?
Kirk Williams (20:02):
Well, think about it. In that arena, who do you think is going to win the ad placement? Because remember, it's an auction where the one who bids the highest will win. That's right. The winners, again, are the big budgets and deep pockets. You see, smaller businesses use user data, anonymized and aggregated, and we'll talk about that later, to target and spend money on ads to people who will be a good fit for their product. If you sell cold brew coffee cans, then having a targetable audience of coffee lovers is valuable to you as an advertiser.
Kirk Williams (20:35):
In other words, most small businesses and advertisers aren't trying to steal consumer personal data so they can cackle in their lairs and wipe out the world banking system. They just want to know who the people are who will buy their product and then show ads to those people. And ironically, this is where the beauty of the internet and targeted advertising comes in. Because the people who have those interests really do want ads that are more relevant to them, as opposed to the alternative, which is just to see ads from deep pockets like Walmart and Amazon all the time.
Simon Poulton (21:07):
The internet is the greatest thing that ever happened for democratizing access for small businesses and for allowing a lot of different folks and a lot of different advertisers to actually get into that market. Versus if you look at TV, the way it was always historically done at a national level and being part of the upfronts, all these things, unless you're a car company or insurance company, or one of these massive players, you were going on local TV, and that's not that great either. Just because you live in this area doesn't mean you want my product. If I'm selling doll houses in traditional New Zealand carving style, I've got a variance model for that. I don't know who I would target. You can't target the entirety of New Zealand. No, I want to be targeting a very specific group. And ultimately, that is the goal of good marketing. It's to bring awareness to products that would make people's lives better.
Kirk Williams (21:54):
So let's go back and reconsider our question from before. Is access to and usage of non-security-issue user information data really, well, that big of a deal? Now let's talk user consent.
Bethany Singer-Baefsky (22:09):
So consent is hugely relevant and it's actually becoming increasingly important, especially this idea of consent before data collection, and opt-in versus an opt-out model. The United States is still primarily an opt-out model. But the rest of the world is really moving more towards opt-in. And I think you'll see that trend continuing.
Kirk Williams (22:28):
Access to personal information without user consent? Sure, most people would agree that information which can personally identify us should be limited and controlled by the user.
Bethany Singer-Baefsky (22:39):
The definition of personal data is some iteration of data that identifies or relates to an identifiable individual. So that can be things that are directly identifiable. Things like full name, if it's a distinctive enough name. Email addresses, telephone numbers, social security numbers; obviously, sensitive information like your health information or precise location. Things like that.
Kirk Williams (23:06):
In this day and age, most PII is already aggregated and anonymized. You might hear those words used in this episode or in articles you run across. Aggregated simply means that your data is grouped with that of a lot of other people who share a demographic; let's say, all Blue Bottle coffee drinkers. And anonymized simply means that the data has been collected in a secure way, so it can't be accessed in order to identify where that data came from.
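To make those two terms a little more concrete, here is a minimal sketch in Python of what pseudonymizing identifiers and rolling them up into cohort counts can look like. The field names, the hashing choice, and the cohort labels are illustrative assumptions only, not any platform's actual pipeline.

```python
# Illustrative only: pseudonymize a direct identifier, then report aggregates.
import hashlib
from collections import Counter

raw_events = [
    {"email": "sam@example.com", "interest": "coffee"},
    {"email": "ada@example.com", "interest": "coffee"},
    {"email": "sam@example.com", "interest": "running"},
]

def pseudonymize(email: str) -> str:
    """Replace a direct identifier with a one-way hash (a pseudonymous ID)."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()[:16]

# Pseudonymized events: raw emails never leave this step.
events = [{"user_id": pseudonymize(e["email"]), "interest": e["interest"]}
          for e in raw_events]

# Aggregated view: only cohort sizes are reported, never individual rows.
cohort_sizes = Counter(e["interest"] for e in events)
print(cohort_sizes)  # e.g. Counter({'coffee': 2, 'running': 1})
```

Note that hashing alone is pseudonymization rather than true anonymization; as Bethany explains next, anonymized data has to be non-reidentifiable, which is a much higher bar.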
Bethany Singer-Baefsky (23:28):
But then there's also, you can have a bunch of indirect identifiers: IP addresses, other unique IDs. And those have been determined to be personal information because they relate to somebody who is identifiable. And that, I think, is where a lot of the challenges come in here, because there is also this concept of pseudonymization or de-identification. With pseudonymization, you basically replace the identifiable information with something non-identifiable, or de-identification, where you entirely separate the identifiable data from the non-identifiable data. So for example, if you have, I don't know, a list of full names and precise geolocations together, those are definitely going to be personal information. But if you just look at the list of full names, if everybody on that list is named John Smith, that might not be PII.
Kirk Williams (24:29):
In other words, let's say you are user 1037543876 on some internet advertising platform. And I, the advertiser, may know that that user, you, want to see a coffee ad, but I don't actually know that you, personally, that you, Sam, are user 1037543876. See the difference? Nobody actually knows that it's you, even if we can advertise to you, personally. Anonymization is actually cool.
Bethany Singer-Baefsky (25:00):
This is actually where some of the interesting Privacy Sandbox stuff comes in that Google's doing right now. Things that are potentially browser-based or that use differential privacy or k-anonymity, but without getting into all the technical stuff there, what you described almost sounds like what they're doing with Topics ... We have this aggregated, anonymized data set. And so it really would depend. It really would depend on, basically, what data is shared and how it's shared. If it's anonymized and aggregated, then more than likely it's not personal information. However, anonymization is a tricky thing, right? It has to be completely non-reidentifiable, and that's extremely difficult to achieve.
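Since k-anonymity gets a passing mention here, a toy check may help: a data set is k-anonymous when every combination of quasi-identifiers (age band, region, and so on) is shared by at least k records. The column names, sample values, and threshold below are assumptions for illustration, not anything Google, iRobot, or Wpromote actually uses.

```python
# Toy k-anonymity check: every quasi-identifier combination must appear >= k times.
from collections import Counter

def is_k_anonymous(rows: list[dict], quasi_identifiers: list[str], k: int) -> bool:
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(count >= k for count in groups.values())

records = [
    {"age_band": "40-49", "region": "MT", "drink": "cortado"},
    {"age_band": "40-49", "region": "MT", "drink": "latte"},
    {"age_band": "30-39", "region": "WA", "drink": "espresso"},
]

# False: the ("30-39", "WA") group contains only one record, so that person
# could be singled out even with no name attached.
print(is_k_anonymous(records, ["age_band", "region"], k=2))
```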
Kirk Williams (25:45):
So now that we understand that, let's think about this question. If your data is already being anonymized and aggregated, so nobody, platform, advertiser, brand, et cetera, knows who you are, does that really put you personally at risk? I would argue that it doesn't, really. See the difference between privacy and security? If nobody knows it's you, then your non-critical data isn't actually harming anyone. So then it all comes down again to user consent in a privacy conversation. You're about to hear the phrase user consent, by the way, come up a lot as privacy regulations and conversations pick up in our world.
Simon Poulton (26:20):
And at the end of the day, I always do the "my mother" test. And that is a little test that I have in my head where I go, would I be okay if my mother was going through this experience? And what I mean by that is, I know how to protect myself online. I know where I should be stopping. I know where I should be utilizing a VPN or not sharing certain pieces of data. That might not be true with my parents or other older folks.
Simon Poulton (26:44):
And so, I always try and put myself in their shoes and go, is this an experience that they should be going through? Is this the kind of data that I would accept if my mother's data was in here? And if the answer is yes, then I feel like I'm doing something ethical, that I'm working within the realm of this is ethical and useful and valid. If it's not, then we probably need to challenge some of our historical constructs of why we collect so much data, how we collected it, how long we store it for, and ultimately what we use it for.
Kirk Williams (27:08):
So let's close this episode with this question: how does user consent fit into this? This may be the crux of the privacy question, and that is, should we just let users decide what data they're okay with sharing? If privacy, unlike security, is primarily about user data being shared to support the free and open internet, then it seems to me like better informing, educating, and offering consumers the choice to consent to the use of their data is really a smart way to go.
Bethany Singer-Baefsky (27:35):
There is this principle in privacy basically called purposes of processing. And so if you provide consent for one form of data processing, that doesn't necessarily imply that the company can just use that consent for a different purpose of processing. Twitter was actually just fined for this recently, where they took phone numbers, I think it was for account verification, but then used those phone numbers in targeted ads. Those are two separate purposes of processing. And if consent was provided for Purpose A and not for Purpose B, and Purpose B isn't obviously compatible with Purpose A, you can't do that, at least under GDPR.
Bethany Singer-Baefsky (28:15):
This system existed. This platform existed. Data was collected for a specific purpose. And as the platform evolved, data collection practices evolved. And it begs the question of, okay, as these practices evolved and these privacy settings became increasingly granular within the privacy settings section, is that enough?
Simon Poulton (28:34):
Now, this is where it starts to get really tricky, because a lot of brands have historically utilized some degree of third-party augmentation provider. And what I mean by that is, if you come to my website and you give me your email, we're now in an agreement that I have access to that email and I can use it to share information with you. I can email you things. I can take that and I can pass it into Facebook, and they can identify it against their first-party data to then target you within their network. Usually that's what's covered by a privacy policy.
Simon Poulton (29:03):
The question, though, that I think is particularly interesting is: do you have the right or do you have the ability to then augment that against a third-party database? And I'm thinking about things here like the Wunderman Thompson database. So you've told me, yep, my name's Kirk, and this is my email. And I go put that in the third-party database and I go, oh, well, it turns out that Kirk's a big Star Wars fan. And it also turns out that he lives in Montana, and all these other things that you didn't actually share with me. And that's a very interesting augmentation of third-party data coming into a first-party realm.
Simon Poulton (29:35):
Now a lot of that's getting stamped out because that's really quite unethical. And this is where we're seeing the formation of clean rooms moving forward. So clean rooms are a really interesting space, like Google's Ads Data Hub, Facebook's Advanced Analytics. The big joke in [inaudible 00:29:51] land right now is, as soon as Netflix announced they're going to have ads on their platform, everyone was saying, I'm so hyped for the Netflix clean room. Every platform's going to come out with some degree of clean room. And the idea there is that you can have a privacy-resilient environment where PII is not exposed, but you can do that degree of matching. And I can see at an aggregated level that I have a thousand people in this group that I'm targeting that all like Star Wars. And so, I'm not going to be singling you out as Kirk. And I'm also not going to have that information directly on you. I will just have an aggregated view of that, because there'll be privacy checks in play to mitigate any degree of individual identification there.
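To sketch the clean-room idea Simon describes, here is a simplified, hypothetical example: two parties match on hashed identifiers inside a controlled environment, and only aggregate segment counts above a minimum audience threshold ever come out. Real products like Ads Data Hub enforce this very differently under the hood; the names, hashing, and threshold below are illustrative assumptions only.

```python
# Illustrative clean-room-style match: hashed join, aggregate-only output,
# with small segments suppressed by a minimum audience threshold.
import hashlib
from collections import Counter

def hashed(email: str) -> str:
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Advertiser's first-party list (hashed before it enters the environment).
advertiser_list = {hashed(e) for e in
                   ["kirk@example.com", "sam@example.com", "ada@example.com"]}

# Platform's side: hashed ID -> interest segment (hypothetical data).
platform_data = {
    hashed("kirk@example.com"): "star_wars",
    hashed("ada@example.com"): "star_wars",
    hashed("sam@example.com"): "coffee",
}

MIN_AUDIENCE = 2  # privacy check: suppress any segment smaller than this

matched = Counter(segment for uid, segment in platform_data.items()
                  if uid in advertiser_list)
report = {segment: n for segment, n in matched.items() if n >= MIN_AUDIENCE}
print(report)  # {'star_wars': 2} -- the one-person 'coffee' segment never leaves
```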
Kirk Williams (30:28):
This allows advertisers continued access to acceptable and accurate user data that offers secure, targeted ads. And it allows consumers to accept and trust this whole internet system, ecosystem thingy, that they're a part of, without it actually harming that ecosystem in any way. The ongoing privacy discussion is complex and it will continue to remain so. But it's a critical discussion to have in maintaining the beauty of what the internet has always been. And we hope that this podcast episode helps to at least further that conversation along. Until next time, I'm Kirk Williams. And may auctions, anonymized and aggregated, be ever in your favor.
Closing Song (31:09):
Oh, the money that better I spend. I spent it in good company.
Chris Reeves (31:24):
This has been the PPC Ponderings podcast. This podcast was produced by me, Chris Reeves. Kirk Williams was the assistant to the producer, interviewer, writer, and narrator.
Chris Reeves (31:35):
Special thanks to our guest interviewees, the Director of Privacy and Data Protection Officer at iRobot, Bethany Singer-Baefsky, and the Vice President of Digital Intelligence at Wpromote, Simon Poulton. Finally, it's worth noting that neither Kirk, ZATO, nor his guests, Bethany and Simon, are lawyers, and the contents of this podcast should not be construed as legal advice. If you'd like to get specific answers to questions pertaining to privacy in the digital landscape in your own individual situation, we'd recommend consulting your attorney. This podcast was recorded in the ZATOWorks Publishing Studio in Billings, Montana. ZATOWorks Publishing is a subsidiary of ZATO, a paid search agency focused on eCommerce brands, owned by Kirk Williams. We look forward to seeing you again next time.
Closing Song (32:25):
This town, and that surely has my heart.