California State Senator Josh Becker on privacy, data brokers and why he’s sponsoring the Delete Act

A proposal to further regulate data brokers in California is getting even closer to becoming state law.

Last week, the Delete Act passed the state’s Assembly Appropriations Committee — with 10 Democrats voting in favor and three Republicans voting against — moving it on to a full vote by the California Assembly later this month. 

Sponsored by State Senator Josh Becker (D-San Mateo), Senate Bill 362 has been praised by privacy advocates but attacked by the advertising industry. Along with requiring more transparency from data brokers, the bill would give consumers an easier way to get data brokers to delete their personal data, create new enforcement measures and impose new fines on companies that violate the law.

In an interview with Digiday, Becker discussed the legislation, why he’s concerned with data brokers and what he thinks of recent pushback from the ad-tech world. (This is the second time Becker has sponsored a data broker bill, following similar legislation he introduced last year.)

“Rights that are too complicated to use are not really rights,” Becker said. “We’re really just trying to make it possible. And by the way, I’ve talked to some folks in the data space who say, ‘We don’t want unhappy customers.’ Great, OK, then let’s make it easy for people to move their data.”

Nearly 500 data brokers are already registered in California, but some estimate the state’s total is twice as high. The Delete Act — which is partially modeled after the National Do Not Call Registry — would help increase data broker accountability and transparency while also giving “people knowledge about what’s out there on them and whether they think that’s a good thing for society or not,” Becker said.

This interview has been edited for length and clarity.

How did you get interested in data privacy in the first place? Obviously, there are a lot of tech companies in your district.

Yeah, there’s a lot of tech happening — my district has been a great driver of innovation in the economy. But it’s also my responsibility to take a skeptical eye and to look at countering any harms that are created. That led me to looking at data brokers and the amount of information they collect on each one of us. And then, concerning that information: is it secure, for example, from data breaches? Where is your information being sold to? What information is available about each one of us?

There are concerns we’ve seen about leaks of sensitive information about people on mental health websites or on substance abuse, alcohol and counseling websites, with that information then getting sold to data brokers. We really need to be thinking about the risks to all of us of having this information floating out there and thinking of ways to give people back control over that information.

If people want to disclose it, that’s one thing. But the problem is, by definition data brokers [are companies] we do not have a relationship with.

Privacy advocates appreciate that this bill would bring data brokers more into the light. How has being based in Silicon Valley helped shape your understanding of ad-tech?

I also worked in technology for many years, so I’ve had a better sense [already]. But I remember way back in 1995, when I was helping build websites, we went to the American Association of Political Consultants Conference and someone told me about all the data points they have on people across the country and how they were going to use that to pinpoint people for political advertising. It was a shock way back then to hear how much information these parties had on each one of us. I’ve been sensitized to it since then.

Recently, I ran a dossier just on me to see what was available. And it’s disturbing to see how much information is out there on each one of us that some people just don’t know about. Part of the goal [with SB362] is to just bring it out of the shadows. We’ll be able to let people, if they want to, go through and say, “Hey, this is a trusted source” or “OK, I see these guys are using my information for health records or other things, that’s fine.” … Right now, with the current [law’s] registry, we don’t allow people to see what kind of information [data brokers] collect about you. But our bill does that. We say you have to show what kinds of information you collect so people can then say, “Hey, if I don’t feel comfortable with that, I’m gonna delete it.” And if they want to do the global delete, they can do that, too.

What kind of information did you learn from your own dossier?

It feels weird even talking about it, but I’m looking at it right now, and it’s quite extensive. You could run this on any one of us and I think people would be really surprised about what’s out there. [For example] people who were my roommates 20-plus years ago. If somebody wanted to impersonate someone in order to scam me, there’s lots of information they could use that would make me think it’s a trusted source … It’s got email addresses, it’s got addresses I lived at all the way back to when I was in Washington, D.C., in the ’90s at a place I lived in for three months over the summer. It’s quite extensive.

With all the concerns around data brokers, how will this bill expand accountability? 

That’s the important part of this: We have auditing ability here and real substantial fines for non-compliance. That’s why the opposition is fighting so hard on this.

It’s interesting to see how this bill has been getting a lot of pushback. Have you been surprised?

Yes and no. I had a data broker bill last year that got killed so I knew there was going to be a fight on this one. There are some folks who’ve worked with us and said, “We don’t want unhappy customers and we want to make this easy,” but also people who have not taken that attitude.

I understand some recent changes since the last vote seek to appease some opponents. What are those? 

Some of the data brokers have an important role to play in preventing fraud, enabling you to see the history of a car if it’s been in accidents, and all that sort of stuff. So there are certain kinds of information we want to make sure can be kept for those legitimate uses.

There are legitimate data brokers who have tight security, but we have to find a way to get this information under control. The scams are getting more and more sophisticated. And the more information that’s out there, the more risk there is.

Will this bill protect consumers from having their data used to train AI models? Since it’s been such a big topic this year, I was surprised not to see any language addressing that, especially since CPRA was passed before the current generative AI boom began.

This bill does not address that. I do have a separate AI bill that would create a comprehensive working group on this and really start to delve in detail into the issues. We just had a roundtable yesterday [late August] on this. So yeah, I’m certainly diving into AI applications, but for me that’s a separate effort. I hadn’t connected the two.

