Kirk Williams
 • 
PPC Podcast

It's All About User Consent - Digital Privacy Interview with Bethany Singer-Baefsky of iRobot


Post Summary

Speaking to the Director of Privacy and DPO at iRobot will teach you a thing or two. At least, that was my experience in this fascinating interview with Bethany Singer-Baefsky.

This interview will challenge what you thought you knew about digital privacy as Bethany schools us all on the complex nature of digital privacy and security. While you definitely want to take a listen, I can give you a hint as to the conclusion of this episode: privacy laws are complicated for a reason. People have different opinions about what consent and privacy actually are! Bethany helps us think through these things together in our final episode on digital privacy.

Make sure to listen to this one with your pondering cap on, because there is a ton to think through! If you missed it, go listen to our Core episode on Digital Privacy & Security where Bethany is one of the guests we interview as we ponder the deeper concepts of digital privacy, and how that specifically impacts digital advertising.

Listen on Apple Podcasts

Listen on Google Podcasts

Listen on Spotify

Bethany is the Director of Privacy and DPO for iRobot Corp. For the past decade, she has built legal, compliance, and privacy programs from the ground up, across businesses and industries, and her approach to privacy is rooted in data ethics, human rights, and consumer trust. Bethany has an LL.M. in Public International Law from Leiden University, where she was a Fulbright Scholar researching international courts.

In her spare time, Bethany enjoys singing loudly and annoying her cat, two activities that go together remarkably well.



Episode Transcript:

(automated transcript, please excuse errors)

Bethany Singer-Baefsky (00:02):

Yeah, these are all personal information rights. And so what it ultimately comes down to is: is the data in question personal information? The concept of privacy goes back to the invention of doors. It's a fundamental human concept.

Chris Reeves (00:23):

Welcome to the ZATO PPC Ponderings Podcast, where we discuss the philosophy of PPC and ponder everything related to digital marketing.

Chris Reeves (00:32):

Today's show is a bonus episode of our full interview with the Director of Privacy and Data Protection Officer at iRobot, Bethany Singer-Baefsky. Bethany is a data privacy expert, and her expertise is on full display in this captivating interview. Before we begin, it is worth noting that neither Kirk, ZATO, nor Bethany are lawyers, and the contents of this podcast interview should not be construed as legal advice. If you'd like specific answers to questions pertaining to privacy in the digital landscape in your own individual situation, we recommend consulting your attorney. Let's dive into our behind-the-scenes conversation with Bethany.

Kirk Williams (01:10):

All right, Bethany, it's great to have you. Could you just start off by giving us your name, title, and where you work? Maybe tell us a little bit about that.

Bethany Singer-Baefsky (01:20):

Sure. Well, first of all, Kirk, thank you so much for having me. I'm really excited to be here today, talking with you about all things privacy and advertising. My name is Bethany Singer-Baefsky, and I'm the Director of Privacy and DPO, which stands for Data Protection Officer, at iRobot, where I've worked for about the past three years.

Kirk Williams (01:40):

Very cool. So can you give us a look at what a DPO is, and what you do in that job?

Bethany Singer-Baefsky (01:49):

Yeah, absolutely. So Director of Privacy and DPO are actually two separate hats. On the DPO side, that's a statutory position. There are a lot of countries and jurisdictions that require somebody to represent the interests of data subjects (the people about whom data is collected) to the companies that they work for. That can mean everything from fulfilling data subject rights, to making sure the company is doing that correctly, to even filing reports in the event of a data breach. Basically, you are giving advice to the company on how to properly protect the rights of data subjects. Even if the company might not feel that a decision is a hundred percent in its best interest, it's your job through that statutory role to provide that guidance regardless, always representing the data subject.

(02:48):

So that's DPO, and that's a big deal primarily in the European Union, but they've also developed that type of role in China and in Brazil as these data protection laws have evolved. The Director of Privacy hat is a little bit different. They are intertwined, but the Director of Privacy actually builds out the overall privacy department, program, and function, and makes sure not only that a company is compliant with various privacy regulations; it can also be more principle-focused, and there's a program management aspect to it. Sometimes the DPO does do both of those things. I wouldn't say there's a tension between them, but it's almost like you put on one hat for one thing and the other hat for the other. So somebody might bring in the Director of Privacy to say, hey, we're working on this new product, how should we incorporate privacy by design? And the DPO side of it might be, how do we make sure that we're GDPR-compliant with this new product?

Kirk Williams (03:51):

Interesting. So does a DPO tend to be an attorney or a lawyer?

Bethany Singer-Baefsky (03:56):

Usually it's somebody with a legal background. I am not an attorney, but I do have a legal background. It's a bit of an odd background: normally an LL.M. is a degree you get after a JD, but I ended up getting a BA and then an LL.M. So I have a master's in public international law, and having an international law degree, which I got in Europe, really helped me understand GDPR. That was actually my start in privacy: GDPR, international law, European law, that whole jumbled mess of awesomeness.

Kirk Williams (04:30):

<Laugh> Yeah. And as we've already seen with CCPA in California, this is definitely on the radar in the United States as well, though typically less so. So what have you found in terms of which businesses tend to really need Directors of Privacy and/or DPOs? Is there a need for more businesses to be aware of this, and to utilize it, than currently are? Are there still businesses that shouldn't need to worry about it? I imagine some of that is even based on business size. When should a business start to really think about this, at least here in the US?

Bethany Singer-Baefsky (05:09):

Yeah, that's a great question. So in the US it's becoming increasingly urgent, because even though there's not yet a comprehensive federal privacy law, there's an increasing number of state privacy laws. We have California, which is the most well known, plus Virginia, Colorado, Utah, and, just last month, Connecticut. Those are the five that have comprehensive state privacy laws. And then there are other states that have their own, more sectoral state privacy laws, as well as, of course, data breach notification laws; all 50 states have those. So it's definitely becoming more important regardless of what business you're in. As a privacy professional, naturally, I believe that if you are handling any type of personal data, you should have somebody on your staff who understands privacy, right? Maybe there's a little bit of self-interest there, but I do believe that, because treating personal data with respect and having proper data stewardship is super important, and not just from a regulatory standpoint. I think regardless of size, if you are engaging in personal data processing, you or somebody on your staff should know the rules, or should use it as an opportunity for professional development to become that kind of privacy expert.

(06:30):

In terms of the urgency for companies: obviously, the bigger you are and the more data you process, the more urgent it's going to be. These different laws have different size thresholds for when they apply. If you are operating in Europe, or you are directing your products to Europe, marketing to Europe, GDPR is going to apply to you even if you're not physically present in the European Union; same with China. So privacy is global. And even in the United States, we're seeing that in Congress there have been, I think, two or three bills at this point that have been discussed, and there are discussions going on now. So it's entirely possible that sometime in the next two to five years there could be a federal privacy law. I think it's a matter of being proactive rather than reactive; that's where companies will really benefit.

Kirk Williams (07:20):

So can you walk us through, at a high level, what we're talking about when we talk about privacy laws? We don't need to go into all the specifics of GDPR and things like that, but on a big-picture level: the US is thinking about a federal privacy law, there are state privacy laws, everyone's concerned about privacy. At a high level, what are people concerned about? Why are these laws coming into play? What has changed that all of a sudden has everyone aware of this? And what are we even talking about in terms of privacy?

Bethany Singer-Baefsky (07:54):

Yeah, that's a key question. <Laugh> That's actually kind of key to the whole thing: what are we talking about when we're talking about privacy? Privacy itself is a vast thing that encompasses personal privacy, communications privacy, information privacy, all of these different functions. The concept of privacy goes back to the invention of doors; it's a fundamentally human concept. But when we're talking about privacy laws today, we are primarily talking about information privacy. And what information privacy means is an individual's right to have knowledge of, and to understand, exactly what a company is doing with the personal information that they provide to that company, or that the company has obtained about them. And it's their right to exercise some measure of control over that information.

(08:50):

So it's this idea that a company has certain responsibilities and individuals have certain rights. The reason why this is snowballing in importance: it started really with the passing of the GDPR in 2016. It went into effect on May 25th, 2018; the four-year anniversary was actually just last week. And that was the thing that really spurred a lot of companies into action and put privacy on the radar of not just legal departments, but also executive teams and boards of directors. It really brought the eyes of the world to Europe, because all of a sudden, if you didn't protect personal data up to a particular standard, you risked not being able to do business in the European Union. You risked fines. You risked losing customer trust.

(09:46):

And the reason why the GDPR passed... the reasons, plural, there's a multitude of them. There was a concern over US government surveillance. There's also the whole surveillance capitalism issue that is becoming more and more discussed today, and you know all about that, the ad tech side of things. So all of a sudden you have all of these technologies where we are getting access to the internet and everything that entails for free, but free isn't really free. What is the cost, right? And people were suddenly realizing: hey, when I look up a pair of shoes and all of a sudden I'm seeing ads for the same shoe when I'm on my laptop instead of my phone, on a totally different website, how did that happen? So there was this growing awareness of, wait a minute, there's a lot of data collection going on, and maybe we should do something about that.

Kirk Williams (10:46):

So let's talk about that data. What determines... that's a loaded question, because obviously that's exactly what everyone is trying to figure out right now. But give us your thoughts, your opinion: what determines which data should be included in that kind of personal privacy and security issue, and which data is not? Because at some level, data has been collected for a long time by businesses, right? Pre-internet, you have that in terms of POS stuff at stores and all that. It's funny, I was just talking to a friend, and we were laughing about the fact that, for me, my phone number is incredibly private. I do not want that out there. And yet back in the day, just 30, 40 years ago, everyone's phone number was literally public information, in this thing that would be delivered to them that no one thinks about now: it's called a phone book. So in some ways, maybe define data a little bit. Are there ways of identifying what data should fall under these sorts of privacy protections? Is it literally every single aspect of anything that can be collected about someone? Are there some ways of collecting data that don't fall into that? Maybe talk a little bit about data in that regard, if you would, please.

Bethany Singer-Baefsky (12:11):

Yeah, absolutely. So first of all, I should have said this at the beginning, but nothing I say here constitutes legal advice. I'm not an attorney, not giving legal advice; y'all are on your own.

Kirk Williams (12:23):

Me too. Ditto. Yeah.

Bethany Singer-Baefsky (12:27):

Yeah, I just realized I forgot to say that, but now that that's out of the way: for a lot of these laws and regulations, the definition of personal data is some iteration of data that identifies, or relates to, an identifiable individual. So that can be things that are directly identifiable: full name (if it's a distinctive enough name), email addresses, telephone numbers, Social Security numbers, and obviously sensitive information like health information or precise location, things like that. But then you can also have a bunch of indirect identifiers: IP addresses, other unique IDs. Those have been determined to be personal information because they relate to somebody who is identifiable. And that, I think, is where a lot of the challenges come in, because there is also this concept of pseudonymization or deidentification. With pseudonymization, you basically replace the identifiable information with something non-identifiable; with deidentification, you entirely separate the identifiable data from the non-identifiable data.

(13:46):

So, for example, if you have a list of full names and precise geolocations together, those are definitely going to be personal information. But if you just look at the list of full names, and everybody on that list is named John Smith, that might not be PII, right? So the thing I always say when it comes to privacy is that, more often than not, the answer is going to be "it depends." It depends on what the data is, the context in which the data is collected, how the data is stored, who's using it and for what purposes, and also what the regulators say about it. The recent Google Analytics case coming out of the EU is a great example. I think a lot of people were very surprised when Google Analytics data, at least in the specific context in which it was collected in that case, was determined to be personal information: it was transferred to the United States, there were IP addresses, and the IP addresses hadn't been properly anonymized, and so on.

(14:48):

There were all kinds of extenuating circumstances in there, but the message that a lot of privacy professionals took from it was: even if you don't have something in there that is directly identifiable, like somebody's full name or email address, data pieces that are not identifiable on their own could be considered identifiable data when combined. So you've got to be really careful with that, and always understand the context in which the data is collected, who could potentially access that data, and what it's being used for. And, I know I'm rambling on here, but there's this concept of data minimization: only collect and use the minimum amount of data you need for a particular legitimate business purpose. That's a question I think businesses need to ask themselves for whatever process they're engaging in that uses personal data. What am I trying to do? What data do I need to do it with, and why? Is there a way I can achieve the same ends using less personal information?
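To make the pseudonymization and data-minimization ideas above concrete, here is a minimal Python sketch. It is illustrative only: the record fields, the salting scheme, and the purpose set are assumptions for the example, not anything prescribed by GDPR or by anyone in this interview. Note, too, that pseudonymized data generally still counts as personal data under GDPR, because the link back to the individual still exists somewhere; only true anonymization takes data out of scope.

import hashlib

# Hypothetical customer record; the field names are made up for illustration.
record = {
    "full_name": "Jane Doe",
    "email": "jane@example.com",
    "shoe_size": 9.5,
    "last_purchase": "2022-05-01",
}

# The salt should be stored separately from the data it protects.
SALT = b"store-me-somewhere-else"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a salted, one-way token."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

# Data minimization: keep only the fields this specific purpose needs.
NEEDED_FOR_AD_MEASUREMENT = {"shoe_size", "last_purchase"}

minimized = {k: v for k, v in record.items() if k in NEEDED_FOR_AD_MEASUREMENT}
minimized["user_token"] = pseudonymize(record["email"])
print(minimized)  # no name, no email; just a token and the minimal fields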

Kirk Williams (15:52):

And so it seems like the idea of user consent would play into that as well, right? So maybe you can talk to that next. It may be that one user would look at an advertiser and say, yeah, it's okay if they have this information about my shoe size, because then I can see ads for that specific shoe and click through; it saves me time. I don't care who knows my shoe size. And it might be that someone else would say, no, I don't want you knowing that I have a really large shoe size; that's my information. Goofy example, but the idea of user consent seems to play into this. Can you talk through how consent is involved in the idea of data, and whether it's playing into legislation as well?

Bethany Singer-Baefsky (16:41):

Sure, absolutely. So consent is hugely relevant, and it's actually becoming increasingly important, especially this idea of consent before data collection, and an opt-in versus an opt-out model. The United States is still primarily an opt-out model, but the rest of the world is really moving toward opt-in, and I think you'll see that trend continue. This idea of providing consent for data use is critically important. The thing to keep in mind as well is that in the EU it's not just GDPR; there's also the ePrivacy Directive, commonly known as the cookie law. But it's not just about cookies; it's about trackers generally. And it's not just about personal information; it's about the data from those trackers generally. So you do need to give people the ability to provide consent. And the important thing about consent is that it can't just be bundled into some huge terms of service or privacy policy, right?

Kirk Williams (17:46):

Facebook

Bethany Singer-Baefsky (17:47):

Exactly. You can't just say, well, I'm going to click "agree" to this privacy policy, so now you can have all of my data. That's not how this is supposed to work. It's consent for specific purposes, consent that is as easily withdrawn as it is provided, consent that requires a specific, unambiguous action. And then there's the whole dark patterns issue, which the FTC is actually starting to crack down on. That's where people make a decision they've been guided to by how the page is set up, a decision they might not normally make if they were given better options. One of the most common instances: you have a cookie banner that takes up half the page, and to get rid of it you just click "agree," but there isn't really anything else you can do. So consent needs to be meaningful in order to be valid.
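The properties Bethany lists (opt-in by default, purpose-specific, withdrawable as easily as it is granted) map naturally onto a small data structure. A minimal sketch, with a hypothetical purpose taxonomy; a real one would come from your legal team:

from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical purposes of processing, named for this example only.
PURPOSES = {"account_verification", "targeted_ads", "analytics"}

@dataclass
class ConsentRecord:
    user_id: str
    # Opt-in model: nothing is granted until the user takes an
    # unambiguous action for that specific purpose.
    granted: dict = field(default_factory=dict)  # purpose -> timestamp

    def grant(self, purpose: str) -> None:
        assert purpose in PURPOSES, f"unknown purpose: {purpose}"
        self.granted[purpose] = datetime.now(timezone.utc)

    def withdraw(self, purpose: str) -> None:
        # Withdrawal must be as easy as granting.
        self.granted.pop(purpose, None)

    def allows(self, purpose: str) -> bool:
        return purpose in self.granted

consent = ConsentRecord("user-123")
consent.grant("targeted_ads")
print(consent.allows("targeted_ads"))   # True
consent.withdraw("targeted_ads")
print(consent.allows("targeted_ads"))   # False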

Kirk Williams (18:41):

Hmm, no, that's great. It's funny: I'm an advertiser, so sure, I love targeted marketing. I like when I can figure out what someone actually wants so that we can provide that for them. That requires data. The flip side is, I kind of like privacy too. I have kids, and I think of a world where there are people I maybe don't want knowing who my kids are and what they do. So yeah, I think that's part of what a lot of us are trying to wrestle with. And the user consent thing is interesting to me. We joked about this idea of stuffing it all into a massively long, legalese terms and conditions that everyone in the world says they read,

(19:24):

and that no one in the world actually does read, in order to participate in something like Facebook. It allows a platform like Facebook to say, oh hey, legally we're totally okay; they agreed to this. One of the interesting things to me is how that has changed over time. I started using Facebook in the very early days. I forget exactly when it came out, but I want to say I was using it in '06, '07, something like that, the early, college-invite-only days. And my understanding when I first encountered it, my awareness of what this thing is,

(20:12):

what this entity is, and my agreement to it: that has changed significantly. At that point in time, the amount of data they collected or utilized, the sheer capability of their machine learning, what they can use and identify... how that has grown in 10, 15 years is incredible. And I think that has progressed a lot faster than the average person's understanding of the capability. So even if Facebook keeps shoving new terms and conditions at someone in an email, and they keep clicking sure, agree, agree, agree, there seems to be a disconnect with someone's actual understanding of what they're agreeing to, especially in a system like Facebook that has evolved over time and really is fundamentally different from what they originally agreed to.

Bethany Singer-Baefsky (21:14):

Oh, a hundred percent. You actually raise a really great point there, especially as it pertains to privacy, because there is this principle in privacy called purposes of processing. If you provide consent for one form of data processing, that doesn't necessarily imply that the company can use that consent for a different purpose of processing. Twitter was actually just fined for this recently: they took phone numbers, I think it was for account verification, but then used those phone numbers in targeted ads. Those are two separate purposes of processing, and if consent was provided for purpose A and not for purpose B, and purpose B isn't obviously compatible with purpose A, you can't do that, at least under GDPR. What you brought up really is reflective of that principle: this platform existed, and data was collected for a specific purpose.

(22:13):

And as the platform evolved, data collection practices evolved. It begs the question: as these practices evolved, and privacy settings became increasingly granular within the privacy settings section, is that enough? Is that valid consent? It's hard to say. I'm not an attorney, and I'm not going to make any pronouncements about <laugh> what Facebook is or isn't doing, or what is or isn't legal; I'm not going to step in that quicksand. <Laugh> But I think you bring up a really valid point that ties into a fundamental, really foundational, privacy principle.
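Building on the ConsentRecord sketch above, purpose limitation can be enforced mechanically: a processing step refuses to run unless the recorded consent covers the exact purpose being requested. Again, a hypothetical sketch rather than anyone's production design:

class PurposeNotPermitted(Exception):
    pass

def process_for(consent: ConsentRecord, purpose: str, data: dict) -> dict:
    # Purpose limitation: consent granted for "account_verification"
    # does not carry over to "targeted_ads" (the Twitter fine scenario).
    if not consent.allows(purpose):
        raise PurposeNotPermitted(f"no consent on record for {purpose!r}")
    return data

consent = ConsentRecord("user-456")
consent.grant("account_verification")
process_for(consent, "account_verification", {"phone": "555-0100"})  # fine
# process_for(consent, "targeted_ads", {"phone": "555-0100"})  # raises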

Kirk Williams (22:48):

Yeah, yeah. Again, going back to the fact that it's certainly been the Wild West, even for us digital marketers. There's been a lot of freedom, and I think digital marketers are at least sometimes culturally seen as almost the bad actors, wanting to use the data to advertise and all that. And I think there are a lot of us who are looking around saying, gosh, no, actually, I'd like to participate in a world where there is safety and security, because that actually is going to make it better for all of us. If there's absolutely no trust involved, we also can't sell our products and services well. So I just think figuring this out, and getting to a place where the internet remains safe and secure (I shouldn't even say "remains"; maybe progresses, evolves into a healthier version <laugh>), as opposed to a little bit of the Wild West of whoever can collect stuff.

(23:46):

Yeah. So anyway, let's talk a little bit about the advertising side. This will be interesting for me, because that's my industry. I'm sure you know a bit about advertising and data, but primarily you're in the privacy world. So let's think it through. At least as I understand it, you have these different entities who are involved. You have the platform itself; by platform I just mean Facebook, Google, that sort of thing. You have the actual advertiser, let's say the brand that is advertising its product. The brand might use multiple third-party vendors for different things: they may have a CRM like HubSpot, they may have an agency like us that places PPC ads. And then you obviously have the actual user as well, with their data. So one of my questions, and I apologize ahead of time, because some of this is me just pondering through these things...

Bethany Singer-Baefsky (24:56):

Nothing to apologize for. This is great.

Kirk Williams (24:58):

So adapt however I ask the question to how you want to talk through it. As we think through this idea of data compliance, who is responsible in an instance with multiple parties utilizing user data? Who is directly responsible for the privacy of that user data? Does it involve where it's physically stored, with servers? Does it involve visual access? If one of my employees sees an email list from a customer in order to send it to Google, is there some sort of privacy liability there? Maybe talk through compliance, who's responsible for the data, liability, and all that, if you would.

Bethany Singer-Baefsky (25:47):

Yeah. So at a very high level, anybody with access in any way to personal information is responsible for the protection of that personal information, in accordance with whatever regulations they may be bound by. I can't really speak to liability without seeing contracts and all of that; if you want insight into liability, check out Article 82 of the GDPR. But the responsibility for data protection really rests with anybody who has access to the personal data. So your employee who is viewing spreadsheets containing customer emails: they shouldn't have access unless they need access to do their job, so there should be access controls around that email list. And they should have the training to know that they shouldn't take a screenshot of it and post it on Twitter. As for where you're storing the data, in most places there would need to be a contract in place to ensure the data is stored in accordance with your instructions and not processed for further purposes without consent, et cetera.
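The "access controls around that email list" point reduces, in its simplest form, to a rule table checked before anyone reads a resource. A toy sketch; the roles and file names are invented for the example:

# Minimal role-based access check. Real systems would lean on the access
# controls in their data warehouse, IAM, or file store instead.
ACCESS_RULES = {
    "customer_emails.csv": {"ads_ops"},  # only the team that needs it
    "aggregate_analytics.csv": {"ads_ops", "analysts"},
}

def can_read(user_roles: set, resource: str) -> bool:
    allowed = ACCESS_RULES.get(resource, set())
    return bool(user_roles & allowed)

assert can_read({"ads_ops"}, "customer_emails.csv")
assert not can_read({"analysts"}, "customer_emails.csv")  # no business need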

(26:55):

So every step along this journey of personal data requires proper data stewardship, and from a compliance standpoint, anybody who's involved is responsible for compliance. I think of it really in terms of that stewardship perspective rather than pure compliance, and that's for two main reasons. The first is that it creates more of a culture of data awareness and more empathy: if you know that it could be your data somebody else is responsible for, and that everybody is responsible for everybody's data, that to me is a good thing. The other part is really a business perspective: if it's treated as purely a compliance thing, compliance fatigue is real. There are new laws coming out all the time around this stuff, and people don't want to jump from a GDPR compliance program to a CCPA compliance program, to a China compliance program, to a Japan compliance program, to all of these different areas. And what if you're operating in 50 different states, right?

(28:05):

So if you understand the underlying principles for protecting the rights of data subjects, the principles that undergird and support all of these different regulations (things like consent, transparency, notice, data minimization, purpose limitation, security of transfer, security of storage), implementing those goes a long way toward actually complying with a wide array of regulations. That way compliance is pretty much your outcome, and you can make some regional variations as needed, as opposed to having to undergo a new compliance initiative every single time a new law is passed.

Kirk Williams (28:50):

It certainly seems that a business that decides to be more on the safe side than the sorry side is probably in a better place right now.

Bethany Singer-Baefsky (29:00):

Definitely

Kirk Williams (29:00):

Being wiser

Bethany Singer-Baefsky (29:03):

And being proactive too, because you save yourself the compliance version of tech debt.

Kirk Williams (29:09):

Okay. So what I'm hearing is: let's say a small business, a small agency, really anyone listening, as they try to process how they should even think about privacy, what they should even do. It sounds like at least a couple of things to do immediately would be, first, making sure someone, anyone, is at least in charge of understanding these privacy aspects and what needs to happen; and second, training the people who would have any sort of access to user data, perhaps just training everyone in the org, especially if it's smaller. Sounds like at least those two would be really important steps.

Bethany Singer-Baefsky (29:50):

Yep, those are important steps. I would add: do a basic Google search on privacy frameworks and see which framework works for your organization. The reason I suggest this is basically what I was saying before: if you can find a framework that elucidates the various privacy principles, and you can implement policies that align with those principles, you're already about 80% there, at least when it comes to compliance with most major regulations. So there's ISO 29100, there's ISO 27701, there's the Nymity accountability framework, and there's the Generally Accepted Privacy Principles framework as well, which is a little older but still has some really good stuff in it. Those align really well with the fundamental points in a lot of different regulations; ISO 27701 maps to GDPR, so that's really helpful. And there's a lot of flexibility in terms of implementation there: creating controls, aligning them to your own company's needs. So I personally, and again, I'm speaking for myself, not for my company, and not as a legal matter, advocate a framework-based approach to privacy, because it puts data subject rights at the forefront, and it also just makes things a lot easier for the business.

Kirk Williams (31:15):

Hmm, that's really good. And, you know, something for small business owners listening: with our business insurance for my agency, privacy is part of that, and actually one of the requirements is that we have documentation of some sort of training framework, that sort of thing. So it even plays into that as well. Okay, and then let's see: data ownership. Let me talk out loud here for a second as I think this through. As I understand it, the user owns their data, and at least that seems to be part of what all of these laws are about. GDPR, everything about that, is kind of this right to be forgotten,

(32:06):

just this idea that things seem to be moving to a place where it's generally agreed that if an individual says, hey, my phone number's out there on the web and I don't want it to be, it kind of doesn't matter where and how that phone number got out; there seems to be agreement that the user owns that personal data. So some of where I get intrigued, and a little confused, is what happens after that. This user agrees to the terms of service, the terms and conditions, let's say, of Google. And part of Google's thing is (they'll always phrase it in a positive light) it's part of what we do to provide you a better and more personalized experience, via targeted ads, you lucky dog <laugh>. So they'll frame it like that.

(33:04):

And so the user says, yep, sure, I'll do that. And of course the courts can then argue over whether or not the user fully understood everything that was going to be included in that, blah, blah, blah. But all that to say, at some point an advertiser comes along and pays Google for access to that data: hey, I want to target males between the ages of 25 and 35 who are into golf, so I can sell them these golf clubs. So the advertiser pays for access to that data. What role does Google have in that regard? Are there any rights that Google has to that data? If the user said, yep, we agree that you can track that we're golfers, does Google have any rights to that data? Do the advertisers have any rights from entering a relationship with Google and saying, hey, we're going to pay you to use that data, all as part of this fully transparent, legalized way this sort of targeting works? What does that look like, out of curiosity? And yes, all the caveats, we know: you're not a lawyer, you're not speaking for your company, all that.

Bethany Singer-Baefsky (34:18):

Yeah, no, it is a good question. I think a lot of it depends on the type of consent given. If somebody gives consent because they're consenting to a contract, that's a different lawful basis of processing than actual consent. With consent, you have the ability to withdraw it as easily as you gave it, as opposed to willingly entering into a contract; those are two separate things. But assuming it is fully consent-based (yes, I agree that you can use this data because I want to see ads from different advertisers), you then have the right to withdraw consent. And when that consent is withdrawn, the data controller, the company that collected your data in the first place, has to stop processing that data.

(35:08):

And normally, unless they're using it for another purpose that they are lawfully able to use it for, they would also have to delete it. So when it comes to Google then sharing the data with an advertiser, this is actually where some of the interesting Privacy Sandbox work comes in that Google's doing right now: things that are potentially browser-based, or that use differential privacy or k-anonymity. Without getting into all the technical stuff there, what you described almost sounds like what they're doing with Topics: we have this aggregated, anonymized data set. So it really would depend on what data is shared and how it's shared. If it's anonymized and aggregated, then more than likely it's not personal information.
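k-anonymity, which Bethany mentions here, has a simple testable definition: a data set is k-anonymous if every combination of quasi-identifiers appears at least k times, so no row can be narrowed down to fewer than k people. A toy sketch, with invented cohort columns rather than anything from an actual ads product:

from collections import Counter

# Toy rows of (age_band, interest); both columns are hypothetical
# quasi-identifiers chosen for this example.
rows = [
    ("25-34", "golf"), ("25-34", "golf"), ("25-34", "golf"),
    ("35-44", "golf"), ("35-44", "golf"),
]

def is_k_anonymous(rows, k: int) -> bool:
    """True if every quasi-identifier combination occurs at least k times."""
    return all(count >= k for count in Counter(rows).values())

print(is_k_anonymous(rows, 2))  # True
print(is_k_anonymous(rows, 3))  # False: the 35-44 cohort has only two rows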

(35:58):

However, anonymization is a tricky thing. It has to be completely non-identifiable, and that's extremely difficult to achieve. So in general, when somebody submits a right-to-be-forgotten request, they're saying, I want all of my data deleted. If I submit that to company A, and company A has shared data with companies B, C, and D in order to provide new services or whatever, it's up to company A to then filter that request out to companies B, C, and D. So theoretically, at least, based on this example, Google would need to make sure that whatever identifiable data it shared with ad partners was then deleted. But again, if it's part of an anonymous, aggregated cohort, maybe that's not going to be necessary.
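That fan-out responsibility (company A forwarding the erasure request to B, C, and D) is essentially a broadcast over a list of downstream processors. A minimal sketch; the partner names and stub functions are placeholders for whatever contracts and APIs actually exist:

# Hypothetical processors the identifiable data was shared with; a real
# controller would track these per data-sharing agreement.
DOWNSTREAM_PROCESSORS = ["company_b", "company_c", "company_d"]

def delete_local_records(user_token: str) -> bool:
    # Stand-in for the controller's own deletion logic.
    return True

def send_erasure_request(processor: str, user_token: str) -> bool:
    # Stand-in for an API call or contractual notice to the processor.
    return True

def handle_erasure_request(user_token: str) -> dict:
    """Delete locally, then forward the request to every downstream
    processor the identifiable data was shared with."""
    results = {"local": delete_local_records(user_token)}
    for processor in DOWNSTREAM_PROCESSORS:
        results[processor] = send_erasure_request(processor, user_token)
    return results

print(handle_erasure_request("user-123"))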

Kirk Williams (36:47):

Interesting. Yeah. So in that way, it might be that the right to be forgotten really only applies to the PII, and not necessarily to the anonymized, aggregated data. Okay.

Bethany Singer-Baefsky (36:58):

Yeah. These are all personal information rights. And so what it ultimately comes down to is: is the data in question personal information?

Kirk Williams (37:05):

Okay. And so is there a clean line in rights, in terms of only that person and their PII? They have all the rights, and literally no one else has any rights? It doesn't matter if they agreed to give it to Google, and Google used it or bought it or whatever; they have the right to be forgotten for that PII no matter what? So in that case, I guess, Google really doesn't have any rights over that data at all, even if, let's say, they purchased it, it sounds like.

Bethany Singer-Baefsky (37:37):

I wouldn't necessarily go that far, because consent isn't the only lawful basis of processing. For example, if I buy something from you, and you have my payment information on file, and I put in a right-to-be-forgotten request, there might be a legal reason why you need to keep certain pieces of data: for your warranty records, or for tax reporting purposes. There might be a legal obligation there that would override. So privacy rights, like other human rights, are not absolute, and there is that tug of war with other laws. Now, that doesn't mean you can just get around everything by making everyone sign a contract <laugh>; that's not going to work either. But things like employment records are an area that's actually creating a lot of challenges right now, because in California the exception for HR data is going to expire at the end of the year, and GDPR already gives employees GDPR rights in an employment context. So if somebody submits a right-to-be-forgotten request to their employer, what does that mean for employment? There are, presumably, employment laws in place that would prohibit certain types of data from being deleted. And so it would be working with your legal teams to figure out, okay, what data do we actually have an obligation to retain?

Kirk Williams (39:02):

Wow, huh. Yeah, I've never thought of that before; you just blew my mind. Let's say it's even something like a crime database, where someone says, yeah, I don't want to be included in that. Well, you did the crime; we have to know, that has to stay in there.

Bethany Singer-Baefsky (39:23):

Right. And you could argue that public interest (that's the term I'm looking for) may override privacy in that case; maybe it's a particularly violent crime. So that's something that, ultimately, honestly, a court would probably decide: somebody would probably sue and say, hey, my record was expunged, take me out of this database. So that would be a legal question. But yeah, there are public interest exceptions, and there are conflict-of-laws type things that come into play. Actually, one of the exceptions in GDPR is having a legal obligation from the European Union: you don't have to delete the data, because European Union law requires you to keep it. Now, that doesn't mean indefinitely. It would very likely mean that at the end of that mandatory retention period you delete it, but we're supposed to delete at the end of mandatory retention periods anyway. So it's not just a privacy thing; retention periods exist in a lot of areas for very good reasons.
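Retention periods lend themselves to a small worked example: once a record's mandatory retention window has passed, it becomes due for deletion. The categories and periods below are invented for illustration; real retention schedules come from counsel, and other obligations (a litigation hold, for instance) can still override:

from datetime import date, timedelta

# Hypothetical retention schedule, per record category.
RETENTION = {
    "tax_records": timedelta(days=7 * 365),
    "marketing_emails": timedelta(days=2 * 365),
}

def due_for_deletion(category: str, collected_on: date, today: date) -> bool:
    """True once the mandatory retention period for this category has ended."""
    return today >= collected_on + RETENTION[category]

today = date(2022, 6, 1)
print(due_for_deletion("marketing_emails", date(2019, 6, 1), today))  # True
print(due_for_deletion("tax_records", date(2019, 6, 1), today))       # False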

Kirk Williams (40:27):

Yeah, no, that is absolutely fascinating. Wow. Okay, lots to think about. I kind of knew this from the direction we want to take our main episode, but especially as we've been talking (and I talked to another guy last week, more on the ad tech side, sharp guy), it just seems like the fairest way to present any sort of privacy episode is basically to not obfuscate how complex everything is. <Laugh>

Bethany Singer-Baefsky (41:01):

Yeah. The more I talk about it, the more I'm like, oh wow, no wonder I'm tired all the time. <Laugh>

Kirk Williams (41:06):

<Laugh> There are so many... I mean, here's the deal, right? There are a lot of lawyer jokes out there, but one of the reasons we need lawyers in our society is that there are no easy answers to a lot of complex stuff, especially once you get into something like privacy. And that's where, as you've noted a few times, oh, well, actually there's this exception, and then this exception, and that exception. That's why it sometimes takes a two-year-long court battle to figure out the best outcome in one unique situation, and then another unique situation comes up.

Bethany Singer-Baefsky (41:46):

And the answer is always "it depends." Always. I've said that in speeches, I've said that in job interviews. Anybody can ask me anything about privacy: well, it depends.

Kirk Williams (41:57):

<Laugh> <laugh> That is funny. I don't know if you know this, but that's a very common thing in digital advertising too.

Bethany Singer-Baefsky (42:05):

See, we have more in common than you think.

Kirk Williams (42:07):

It really is. It's funny, because it's just one of those: well, why did my numbers do this? Or, if we do this for an ad, is this the return on investment we'll get? Well, it depends <laugh>; there are these other 15 factors that influence it. So anyway, let's see, I'm trying to think if there's anything else. Maybe one last question would be the future of privacy as it relates to advertising. You did bring up the sandbox, the Privacy Sandbox with Google. They announced FLoC, thought the world was going to applaud them and slap them on the back, and then the world promptly rejected FLoC. Which, by the way, do you have any thoughts on that? Were there specific reasons that you thought FLoC was unhealthy, or that the world thought FLoC was unhealthy, out of curiosity? If not, that's fine.

Bethany Singer-Baefsky (43:05):

Yeah. I personally found it more interesting than concerning, and I was really curious to see where it would go. But I think Topics pulled it back into a more privacy-protecting mode. I think the issue with FLoC, as far as my perception goes, was that it wasn't really as privacy-preserving as people expected; it was sort of a rebrand of existing practices. Maybe that's incorrect, but I think that was the general perception. And I think Topics does more of what FLoC claimed to do, if that makes sense.

Kirk Williams (43:38):

Where do you see personalized advertising morphing over the next 10 years, if it does, because of all this?

Bethany Singer-Baefsky (43:47):

So, first of all, I think that's a great question, and I think it's one we're all kind of wondering about. Honestly, we could make a drinking game out of the number of opinions there are, and start taking bets. But personally, I don't think attempts at personalization are going anywhere. I think fundamentally, if people have to see an ad, they do like it to be something that's relevant to them. Personally, when I see ads that are not relevant, it's mostly entertaining, but I've actually bought things from targeted ads before. They're not in and of themselves a bad thing. I wouldn't even call them a necessary evil; I don't think they are evil, which may be blasphemy coming from a privacy professional <laugh>, but I don't think it's a bad concept in and of itself.

(44:38):

I think the issue comes with a lack of transparency from website operators and from ad tech providers. Like you mentioned before, people don't understand what it is they're consenting to, and they don't understand the data flows. I was in a workshop at a privacy conference in April that was four hours long, on ad tech, and I considered myself kind of well versed going in. I came out after and went, oh my God, I knew nothing. It's so complicated; what you do is so complicated. And I think the laws haven't caught up yet with the technology, and the technology, weirdly enough, is at the same time trying to catch up with the laws. It's a weird kind of inverted race that's happening.

(45:24):

And I think attempts at personalization are not going to stop, because there is value there, and I think both consumers and advertisers recognize that value. But it has to be done in a privacy-forward way, in a way that protects people's rights. I think privacy-enhancing technologies are going to continue to evolve. The Privacy Sandbox, like you mentioned, is a place where some of that experimentation is happening, and I see a lot of potential there, especially in browser-based technologies, technologies where the data exists on the individual's device as opposed to being shared externally. So there's certainly potential for evolution in that respect, and that's where I see this going.

Kirk Williams (46:12):

Hmm. Yeah, that completely does my heart good as an advertising professional <laugh>. That's exactly it: part of why I like advertising is that when it's done well and correctly, you're actually meeting a need. You definitely have the idea that marketing can be this negative thing, especially if it uses powerful, emotional appeals to almost trick someone into buying something they didn't want. I totally agree; that's not healthy. It's not healthy to guilt people into buying something: you don't look good enough, so you need this to look better.

(46:58):

Right. Those kinds of things are some of the reasons why, I think, the advertising industry rightfully has a black eye. The flip side is that all of us, just as you said, have experienced a time where, man, we actually had a need for XYZ, and because of that we saw an ad for XYZ, and we said, that's amazing, I'm going to buy XYZ. And everyone's happy: the user, the advertiser, the brand, all of that. And the other side, and this is something I'm really taking away from our conversation more and more, is just how important proper user consent is in this whole discussion. Because if someone doesn't necessarily know what information is being used, or doesn't want it to be used, especially when it's their personally identifiable information, their PII, then it's not right for that to be used.

Bethany Singer-Baefsky (47:54):

No, absolutely. And I think the thing that's going to be important to keep an eye on is technologies that try to get around some of the consent issues. I've seen a lot of technologies where the selling point is, well, we can measure conversions through opted-out users. I understand the value proposition, but they're opted out, so how are you doing this, right? So that's something to keep in mind as these technologies develop: how is the data actually being used? If there's talk about using data from opted-out users, what data is being used? How do we know that people have actually opted out? What's the opt-in, opt-out process? And how do we transparently engage with users so that they understand what it is they're opting into? I think that's critical here. There are going to be, I think, two technology branches: that sandbox approach (privacy-enhancing technologies, browser-based, et cetera), and then maybe more of the pixels or APIs or things that go around the browser-based consent modules, which may require a little more scrutiny, because, well, how are you actually managing opt-outs?

Kirk Williams (49:19):

Hmm. Yeah, good stuff. Bethany, thank you so much for joining us. What a pleasure. I feel like I learned so much, so thank you.

Bethany Singer-Baefsky (49:30):

Oh, thank you so much. It was great chatting with you.

Chris Reeves (49:33):

This has been a bonus episode of the PPC Ponderings Podcast. Keep checking back for more interviews and our next full episode. If you like what you hear, please consider sharing this with your network and leaving us a review on Apple Podcasts. Until next time, may the auctions be ever in your favor.

Kirk Williams
@PPCKirk - Owner & Chief Pondering Officer

Kirk is the owner of ZATO, his Paid Search & Social PPC micro-agency of experts, and has been working in Digital Marketing since 2009. His personal motto (perhaps unhealthily so) is "let's overthink this some more."  He even wrote a book recently on philosophical PPC musings that you can check out here: Ponderings of a PPC Professional.

He has been named one of the Top 25 Most Influential PPCers in the world by PPC Hero 6 years in a row (2016-2021), has written articles for many industry publications (including Shopify, Moz, PPC Hero, Search Engine Land, and Microsoft), and is a frequent guest on digital marketing podcasts and webinars.

Kirk currently resides in Billings, MT with his wife, six children, books, Trek Bikes, Taylor guitar, and little sleep.

Kirk is an avid "discusser of marketing things" on Twitter, as well as an avid conference speaker, having traveled around the world to talk about Paid Search (especially Shopping Ads).  Kirk has booked speaking engagements in London, Dublin, Sydney, Milan, NYC, Dallas, OKC, Milwaukee, and more, and has been recognized through reviews as one of the Top 10 conference presentations on more than one occasion.

You can connect with Kirk on Twitter or Linkedin.

In 2023, Kirk had the privilege of speaking at the TEDx Billings on one of his many passions, Stop the Scale: Redefining Business Success.
