The signature is just the beginning. DocuSign's new Intelligent Agreement Management platform offers a modular approach, enabling developers to build scalable, tailored solutions with a comprehensive suite of APIs and tools that span the entire agreement lifecycle. Learn more and sign up for free at developers.docusign.com.
Hello, everybody. Welcome to the Stack Overflow podcast, a place to talk all things software and technology. I am Ben Popper, director of content here at Stack Overflow, joined by my colleague and co-host, Ryan Donovan, editor of our blog, veteran technical writer, API docs wrangler. Ryan, one of the things I know most about our audience at Stack Overflow, the folks who make up the community and the people who take the dev survey, is that they care deeply about digital privacy. Yes.
They do not like cookies. They do not like trackers. They like to use anonymous browsers. They are, you know, the kind of folks who have the skills and the awareness to really put some effort into controlling how their digital information, their PII, gets around. But that's not true for most people. I am the technical advisor to several folks who are 20 years older than me, and I don't think they really know the first thing about digital privacy. So today, we are lucky enough to have a guest, Suki Gulati-Gilbert, who is a public interest technologist and the senior product manager on Permission Slip,
which is a product from Consumer Reports, an app that helps users exercise their digital data privacy rights. So very excited to chat about this. Suki, welcome to the Stack Overflow podcast.
Great to be here.
So full disclosure, you now work with Khaled El-Khatib, my old boss, who was at Stack Overflow and is now at Consumer Reports. He wanted to connect us because he thought this app would be really interesting to our audience, which makes a lot of sense to me. But he also mentioned that Consumer Reports, or at least this part of it, is a nonprofit or a public good. I wasn't aware of that. Can you tell us a little about that?
Yeah, Consumer Reports, the whole organization, is actually a nonprofit, with a long, rich history of protecting consumer rights and advocating for all types of changes that make consumers safer. Consumer Reports has believed for a long time that privacy is a fundamental right, and so we've been exploring how we can help consumers protect their digital privacy. One of those efforts is this application, Permission Slip.
Nice. Okay, I was not aware of that. So I guess let's just do a little bit of background, and then we'll talk about what you're working on now. I feel like you have an interesting set of experiences when it comes to privacy,
because you spent four years working at a large social network, doing both software engineering and managing engineers there, but then went on to graduate research and a technology fellowship at the Center for Democracy and Technology, and now Consumer Reports. So talk to us a little bit about what was formative in your relationship to being a software developer and thinking about digital privacy.
Yeah, it was a bit of a long, winding road to get here today. I started off in undergrad studying human-computer interaction, specifically how computer systems could help activists and change the landscape of advocacy and activism.
And that's how I came to, as you mentioned, a very large social network, because there was a lot of optimism at the time around how crowdsourced media, connecting people, and live streaming could shake up dominant streams of information and bring new perspectives to light. I had the great fortune there of working on local news, overall news, as well as public safety, and I learned a lot about what you can build when you have access to a lot of data.
One of the things I came to believe in my time there is that I'm not sure so much data should be centralized within these corporate silos, if that makes sense. I became very interested in technical interoperability, as well as user control of data flows, because it's just so obscure. It's so obscure within a company, and it's so obscure across companies. Everything we do on the internet is trapped and sent to tons of places that consumers are not aware of. That was extremely clear to me as an engineer, and even more clear as I took a hiatus from the corporate world, and from working as an engineer and engineering manager, to go over to MIT and study privacy.
So at MIT, I studied specifically privacy compliance and the interaction between privacy legislation and what that actually means when it gets translated into code. It turns out that if you don't design a system with privacy in mind, it's really, really difficult to retrofit a system to understand things like data provenance and where the data is flowing and ensuring that it's effectively deleted.
So that became really interesting to me: you have these legal frameworks, like the GDPR, which mandate really clear things like effective data deletion and the right to be forgotten.
But the gap between those mandates and implementing them in code is pretty wide. And one of the very few things consumers can do to keep the pressure on, to ensure that privacy is respected not just in systems built with privacy in mind but also in the existing systems that have our data, is to continue exercising these data rights: continuing to ask that your data be deleted, that your data not be sold. So that's how I came to work on this at Consumer Reports. Permission Slip is working on that exact problem, taking these legal rights and making them accessible, making them easier to exercise.
Yeah, that's great. You know, it seems like every week there's a new bit of news about digital privacy breaches or worries. There's 23andMe being sold to a private equity firm. There was a breach that leaked everybody's Social Security numbers. There was one a while back where, I think, therapy records from Finland were released online.
So in the face of all this private data being leaked to the web and the dark web, how does somebody manage that and work with that?
Totally. So as you alluded to at the beginning, data minimization is always the most effective, right? To the extent that you can use private browsing mechanisms, interact primarily with trusted sites, and ensure that your data is contained, that's the most effective. But that's not realistic for most people. It's a lot of work, and sometimes we like to think of privacy as more of a hygiene feature, in the sense that you're not using a site to be private. You're using a site to buy something or to entertain yourself with a quiz, and you're not primarily browsing the internet with privacy in mind. So what do you do when your data is out there? What do you do now that data brokers have your data? We can talk more about data brokers too. That's where Permission Slip comes in. So, a little bit of the legal history here.
In the US, the seminal piece of legislation, which went into effect in 2020, was the CCPA, the California Consumer Privacy Act, which codified three important data rights: the right to know what data companies have on you, the right to ask that that data be deleted, and the right to ask that the data no longer be sold or shared.
There are varying degrees of success in exercising these rights, depending on what state you live in and whether your state has a privacy law. California is no longer the only state with a privacy law, but privacy laws still don't cover the majority of states by any stretch of the imagination. It also depends on how compliant a company is. Still, the primary tool consumers can exercise is really just asking companies not to sell their data, not to share their data, and to delete that data,
to go ahead and reduce your digital footprint in that way. And that's where Permission Slip comes in. Consumer Reports ran a study which found that on over 42% of sites tested, at least one of three testers could not find the link to exercise these data rights. And we're talking about a setting where these people are aware of the right.
They're looking to exercise it. And it's still difficult to go to the privacy policy, find the fine print, and go to the link to ask that your data no longer be sold or shared, or to ask that it be deleted. So while these legislative rights exist, and exercising them is one of the few tools in our toolbox, that doesn't mean it's easy to do. I think what we've done is really an exercise in usability and an exercise in interface design: taking these legal rights that people connect with and have an interest in, and putting them in a format that is easy to use and easy to execute.
Yeah, I guess I've become a little bit cynical or nihilistic about the whole thing. Like Ryan mentioned, I'm always hearing about massive leaks of Social Security numbers, credit card numbers. I occasionally get pieces of paper mail that say, sorry, we leaked all your data, and you have the right to have us go and try to clean it up. But the amount of work that would take just seems overwhelming. So I operate on the principle that I have to trust my bank and my email provider and a few other folks who do have good security, use 2FA, and hope that if somebody gets access to my stuff, they'll be caught before serious fraud can occur, and that if fraud does occur, somebody else is going to eat the cost, because it feels impossible to me to take control of my personal data. When I go to Have I Been Pwned, the breaches are just so numerous, and it's apps and services I haven't used in 10 years, that I downloaded one time. So I guess I feel a little bit hopeless in that sense. But maybe from your perspective, if I were to go sign up for Permission Slip, what kind of things do you think it would tell me, in terms of hygiene, prophylactic hygiene, versus repairing damage done or shutting off avenues that might potentially be harmful to me? What am I looking at?
Yes. Okay, quite a few things to say here. The first is that I totally agree: it can be really overwhelming and create
a feeling of cynicism and lack of control, because it is so obscure where data is flowing, who has control of it, and what they're doing with it. Actually, when you download Permission Slip, we will do it for you. We send these requests for you. That's what we mean when we say we're making data rights accessible. On the application, we have a free tier and a premium tier. For a lot of companies, we will still send the request for you on the free tier, because we're mission-first and about making it easier to take action; in the cases where it's a bit expensive for us to do that, we'll direct you to the link so you can do it quickly. And then on our premium tier, we're instantly sending requests to hundreds of data brokers for you. The data brokers are the ones that make a profit by buying and selling and exchanging your data,
which brings me to my second point: it can be really overwhelming, and that's why I think the goal is not to prevent your data from ever getting breached or leaked, because that's a sort of unrealistic standard of success in the world we're in, where data is kind of everywhere. But reducing your digital footprint can not only reduce the risk of breaches, it can also reduce the number of those election texts you're getting, and reduce the risk of your data being sent to your insurance provider or coming up when a future employer runs a profile on you from various people-search sites. So there are these hidden effects. It's not just a data breach leaking your data; there are very real inconveniences to your daily life, in terms of the number of texts or spam emails you're getting,
the insurance premiums you're paying. Reducing your digital footprint can pay out dividends in all of those ways, so it's still worth doing. On our end, we see decent fulfillment rates, better than you might expect, and every request counts. The other advantage is that because we see the volume on our back end, you get the benefits of collective action. So even when you send your own requests through Permission Slip, on our end we can see, say, there's no privacy law in Utah, but we have 10,000 requests from people in Utah to this company. Can we tell them? Can we say, hey, there are 10,000 of these, given the sheer volume, would you comply with them? We have a little bit more leverage to bargain on your behalf. So I still think it's worth it, especially because we're out here doing as much of the heavy lifting as we can for you. The goal is not necessarily preventing every instance of data leakage, because it's just an inscrutable sea at this point, but reducing your footprint pays off in other ways.
You had me at fewer spam texts, but I'll let Ryan take it from there.
Yeah. You mentioned data brokers, and I think that brings up a complication, because you can go to every website you've visited and say, remove my stuff. But
with the data brokers, you don't know who has your data at that point. I think the last big Social Security leak was from some sort of data broker, some company that no one had ever heard of. How do you even find out where your data exists?
Great question. So there's a lot of work being done to bring more transparency to this very obscure layer of the internet, the data brokers. There are various state registries; California's, for example, lists hundreds of these data brokers. Legislators are working to increase the penalties for not registering. So thanks to
advances in the regulatory landscape, it's getting easier to even know who these entities are. There are different approaches to this; our approach is to get deletion requests out to a bunch of these entities. And in our testing, we manually send these requests out on users' behalf, for example,
to see the efficacy as we're onboarding different data brokers onto the platform. Some of them do have publicly searchable databases, so in some cases you can search for yourself beforehand, send the request, then search again to hold yourself accountable. Sorry, not to hold yourself accountable, but to hold that entity accountable for actually having removed your data. But it's definitely a bit of an adversarial landscape. For some of these opt-out requests, for example,
you actually have to search for yourself, find the specific record number on their site, and then send the opt-out request for that record number. And if they don't have a publicly searchable database, you just send the deletion request anyway;
you can't know for sure until they respond to you, letting you know whether they found a match. So yeah, there are a few challenges, right? The first is knowing who they are; the regulatory landscape, as well as experts in the space, have helped shed light on that. The second is knowing whether they have your information; some portion of them are publicly searchable, and some are not. And the third is ensuring some verifiability in getting your information removed.
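To make that search-then-request-then-recheck loop concrete, here's a minimal sketch in Python. Everything about the `Broker` interface is hypothetical, since brokers expose wildly different sites and forms rather than any standard API; this only illustrates the verification pattern described above.

```python
# Hypothetical sketch of the verification loop described above. There is no
# standard broker API, so search_records / submit_deletion_request stand in
# for whatever site, form, or email workflow a given broker actually exposes.
from typing import Protocol

class Broker(Protocol):
    def search_records(self, person: str) -> list[str]: ...  # record IDs, if publicly searchable
    def submit_deletion_request(self, person: str, records: list[str]) -> None: ...

def delete_and_verify(broker: Broker, person: str) -> str:
    before = broker.search_records(person)          # 1. is this person listed at all?
    broker.submit_deletion_request(person, before)  # 2. send the request, with record IDs if the site demands them
    # 3. in practice, wait out the statutory response window (45 days under the CCPA)
    after = broker.search_records(person)           # 4. re-check to hold the entity accountable
    return "removed" if not after else "still listed: follow up or file a complaint"
```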
As I mentioned, there are some things we can do right now, which we do, including following up and checking again. But beyond that, there is a whole space of research. I know some folks over at MIT who I used to work with are working on data traceability, actually creating some kind of audit trail so that we can get technical verification that data really was deleted. But of course, that work is farther out.
Yeah, there was a point in time when I was a journalist and I wrote some stories that were not very flattering to someone with a very large number of fans online. And then odd things started to happen: people applying for things under my name, strange phone calls, a jump in the number of spam emails. The company I was working for at the time, when this happened to their journalists, would enroll them in a year of something called DeleteMe.
How would you compare DeleteMe to the product that you built? And I guess, you know, you were just getting at this: to what degree is this one-and-done versus ongoing? It's like, how many times a day do I have to brush my teeth, and if I skip this many weeks in a row, that's when I'm going to start to get serious cavities?
The primary difference between us and DeleteMe is that we are run by the nonprofit Consumer Reports. So our angle is a little bit less crisis control, where you're getting doxxed and let's try to immediately remove all traces of you, although we do our best to do that too, don't get me wrong. The goal on our end is also a little bit more ecosystem-level, in terms of getting more privacy legislation passed and contributing to Consumer Reports' overall efforts to create
a fair and free marketplace. So as I sort of alluded to, if we are seeing rates of noncompliance, we're actually filing enforcement complaints with the relevant agencies. If we're seeing that there's interest in a particular state in a privacy law, and our advocacy team is telling us there's momentum in that state, then we're working with them to contribute to getting that privacy law passed. So when you're working with Consumer Reports, we're not just submitting privacy requests on your behalf; you're contributing to a much larger effort to get privacy legislation passed at the state level and federally, and supporting all kinds of legislative efforts on behalf of consumers. We actually were recently a huge force in getting the STURDY Act passed, which prevents furniture tip-overs for child safety. So it's not just privacy that Consumer Reports works on; it's one of the spaces we work in for consumer protection. To me, that's the biggest difference: we're not a for-profit company. We really are about the mission of trying to make privacy rights accessible, and part of that is working on all fronts to increase consumer protection. And then, of course, we are sending those requests to a very similar set of data brokers.
Yeah. Well, as somebody who has to yell at a four-year-old,
I appreciate the help of the STURDY Act. So speaking of other legislation, are there holes in existing legislation that you would look to get plugged? Obviously there's the GDPR and the California law, but the internet is international. Is there any way to plug those international holes with local laws? And are there local laws that aren't strong enough?
Yes, so a few things. The first barrier would just be having legislation at all, in the sense that California has a law, and so do Colorado and Connecticut and a few other states, but there are still a lot of states that don't protect digital privacy in any way. So we would of course love to see more of that, and Consumer Reports did just put out a model bill, so I can link that afterward if listeners are interested in seeing specifically the types of things our advocacy team is working on when it comes to digital privacy. There are also different levels of protection even when privacy bills
do exist. One thing we haven't talked about, for example, is the authorized agent provision. The way Permission Slip is able to send requests on your behalf is that the CCPA has a provision saying you can delegate an authorized agent to do this for you. So you sign something that gives us the ability to send these requests on your behalf as your official authorized agent. There are states with privacy laws that don't have this authorized agent provision, and of course there are all kinds of parties that don't want it to exist, because it makes it that much easier for you to get these requests sent out in a way that is legally defensible and enforceable. So absolutely, there are all kinds of different flavors of these laws that can help ensure digital privacy, and I think it's important to consider not just that your rights are protected, but that
things like the supporting provisions to ensure that they're accessible and easy to use are also in those pieces of legislation. And yeah, I'll definitely send along that model bill too, because it gets into a lot more specifics.
One question I want to ask, just for the folks who are listening: is there anything interesting you'd like to note about the tech stack you used to build this, or about the way you maintain and update it as legislation and other things evolve?
Yes. One thing I want to say is that on the back end of this product, we have a pretty complex system running of human power and automation and all that good stuff. We have automation to help process requests as they come into the inbox. We have different tools to help make it more efficient for actual people to fill out forms on your behalf. And we are leveraging LLMs to summarize privacy policies. We don't run any user data through those; as a privacy team, that's of course something we're sensitive to. But we are running corporate privacy policies through them to make them more digestible, to make them really easy to understand.
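As a rough illustration of that summarization step, here's a minimal sketch assuming an OpenAI-style client; the model name, prompt, and overall shape are illustrative guesses, not Permission Slip's actual stack. The key property is that only the company's public policy text is sent to the model, never user data.

```python
# Minimal sketch of summarizing a corporate privacy policy with an LLM.
# Hypothetical: model and prompt are illustrative; policy_text would come
# from the company's public privacy policy page. No user data is sent.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize_policy(policy_text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Summarize this privacy policy in plain language: "
                        "what data is collected, whether it is sold or shared, "
                        "and how a consumer can opt out or request deletion."},
            {"role": "user", "content": policy_text},
        ],
    )
    return response.choices[0].message.content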
And in parallel to all of this, we are working on the Data Rights Protocol. That's a protocol we're developing to help standardize the delivery and receipt of these privacy requests, so that hopefully they can be fulfilled via API and it's easier to get compliance at scale. Part of the challenge right now is that it's this patchwork of sending emails and filling out web forms, in some cases sending an email in order to get a link to a web form, and in some cases finding a record on their site to put into a really buggy web form with a hundred pop-ups. The Data Rights Protocol tries to sidestep all of that with structured data rights requests that we're hoping are extensible to all kinds of different legal regimes.
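For a flavor of what such a structured request could look like, here's a sketch of a client posting a deletion request. The endpoint path and field names are loosely modeled on the public Data Rights Protocol drafts, but treat them as illustrative assumptions rather than the actual specification.

```python
# Sketch of a structured data-rights request, loosely modeled on the public
# Data Rights Protocol drafts; the endpoint path, field names, and identity
# token format here are illustrative assumptions, not the real spec.
import requests

def send_deletion_request(api_base: str, identity_jwt: str) -> dict:
    payload = {
        "regime": "ccpa",              # which legal regime the request is made under
        "exercise": ["deletion"],      # the right being exercised
        "identity": identity_jwt,      # signed token attesting to who the consumer is
        "relationships": ["customer"], # how the consumer knows the business
    }
    resp = requests.post(f"{api_base}/v1/data-rights-request", json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()  # e.g. a request ID plus a status to poll later
```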
So there's quite a lot of technical work happening. But one thing I'll say, and maybe this is coming from my HCI background: there has been so much value in developing this protocol in parallel to the usability experiments we've been doing. The first version of Permission Slip was
a Tinder-like deck of cards of companies. We applied this dating-app interface to legal privacy rights, in terms of actually getting something engaging and compelling to users and saying, let's make this fun, even while working on technical standardization. It's been such a good feedback loop. We've learned so much about how to make it engaging, and we've totally busted the myth that people don't care about their privacy. I think they do; it's just really hard to take action on it. So ensuring that we make it easy and engaging to take action is as important as getting the tech right.
So is this, swipe right if I want this company to expunge my data, swipe left if I feel like we have a good relationship and they can keep my data?
It was. That's how it started. Yeah.
All right, everybody, it is that time of the show. We want to shout out someone who came on Stack Overflow and helped to share a little bit of knowledge or curiosity, and in doing so, benefited the whole community. A Populist badge, awarded just yesterday to Martijn Pieters, who's got over a million rep. Oh, he's helped so many folks. And the question is: what is the Big O of the following if statement? It's about the runtime of Python's "if substring in string" check.
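For anyone curious about the gist: `in` on Python strings is a substring search, not a hash lookup, so its cost scales with the lengths of both strings. A small sketch, with the commonly cited complexity hedged in the comments:

```python
# The construct in question: membership on strings is a substring search.
haystack = "the quick brown fox jumps over the lazy dog"
needle = "lazy"

if needle in haystack:  # unlike `in` on a set or dict, this is not O(1)
    print("found")

# CPython uses an optimized string-search routine here, so for a haystack of
# length n and a needle of length m it runs near O(n) on average, with
# pathological inputs degrading toward O(n * m) in the worst case.
```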
You helped over 30,000 people with your answer to this question, so thanks so much. As always, I am Ben Popper, director of content here at Stack Overflow. You can find me on X, @BenPopper. If you have questions or suggestions, email us: podcast@stackoverflow.com. If you want to come on the show, or you want to hear us talk about something, let us know. And if you like today's program, why don't you do me a favor and subscribe, so you can hear more in the future.
I'm Ryan Donovan. I edit the blog here at Stack Overflow. You can find it at stackoverflow.blog.
And if you want to reach out to me, you can find me on LinkedIn.
I'm Suki Gulati-Gilbert. I'm the senior product manager working on Permission Slip by Consumer Reports. You can learn more about Permission Slip at PermissionSlipCR.com and download it on both iOS and Android.
Awesome. All right, everybody, we will put those links in the show notes. Thanks for listening, and we will talk to you soon.