The Federal Trade Commission yesterday began the next step in gathering information to help it decide whether to propose new privacy regulations governing how companies can collect, use and share consumer data for business purposes.
During the federal agency’s first public hearing related to “commercial surveillance and data security,” experts and members of the public addressed concerns about data privacy while also suggesting where they thought regulation could be used and the potential impacts on businesses and consumers.
The forum — which follows the FTC’s Advance Notice of Proposed Rulemaking published last month — is part of a lengthy process the FTC will need to adhere to before approving any potential rules. Through the notice, the agency is asking for public input on dozens of questions to inform any potential new data privacy rules, regardless of whether a new national law is passed.
It also comes as U.S. lawmakers consider federal legislation around data privacy even as the FTC flexes its muscles through ongoing and settled lawsuits. (Last week, the agency sued the Idaho-based data broker Kochava over its geo-location practices.)
Consumer privacy concerns
Consumer and industry experts expressed concerns about how data is used to discriminate against consumers based on race, income, gender, age and other personal information. According to consumer advocates, that could affect how people are approved for everything from mortgages and credit cards to how they can compete for jobs. Spencer Overton, president of the Joint Center for Political and Economic Studies, cited a June settlement between Meta and the U.S. Justice Department about a housing discrimination case.
“Ads for employment opportunities can be steered toward male users and away from women,” Overton said. “Ads for new housing can be steered toward white users, and away from Black and Latinx users.”
New privacy laws are needed to address newer formats like email and online search, said Caitriona Fitzgerald, deputy director of the Electronic Privacy Information Center (EPIC), who noted that Google’s tracking of email and search history for ad targeting would be illegal if applied to older communication formats like mail and telephone calls. Karen Kornbluh, director of the Digital Innovation and Democracy Initiative at the German Marshall Fund of the U.S., said the overturning of Roe v. Wade “clarified for many” how vulnerable their online activities leave their private lives, following recent revelations of how tech can track and identify people who search online for and visit abortion clinics. Kornbluh, a former U.S. ambassador, also pointed out a “national security loophole”: no laws ban foreign governments from buying sensitive data about American consumers.
“The current consent framework is insufficient,” Kornbluh said. “The companies that we as users deal with online have an asymmetry of information, yet none of the obligations of doctors or lawyers, [who] have strong professional ethical constraints and legal obligations to act in the user’s interests with the extensive profiles they have.”
Although retailers use data to maintain relationships with consumers and compete in their industry, Paul Martino, vp and senior policy counsel for the National Retail Federation, said protecting a brand’s reputation gives retailers an incentive to keep strong data practices in place. However, he said third-party data brokers, which are less known to consumers, can pose a “much greater risk” because consumers don’t know what to expect from them or what data they have access to.
“Third-party businesses lack the incentives of customer serving businesses to use data responsibly and in alignment with consumers’ interest because they are not in pursuit of long-term customer relationships with the consumers whose data they collect and process,” he said.
How to improve data privacy
Panelists also made a number of suggestions based on what they’ve seen companies and laws do to promote responsible data use. For example, Jason Kint, CEO of Digital Content Next, said new state laws giving consumers ways to notify businesses of their privacy preferences can be helpful. But Martino cautioned that making regulations too broad could also block data-driven relationships with companies that consumers might actually want.
Rebecca Finlay, CEO of Partnership on AI, said companies that create processes for documenting data collection and usage — such as for machine learning — can better measure the impact their systems have. According to Finlay, that creates a “foundation” for accountability and transparency inside and outside the organization.
“This is not just about creating a checklist of characteristics, or even potential sort of mathematical or technical models,” she said. “This is really about creating management systems and processes that stretch right from the design, development and deployment of the machine learning system being considered.”
And there should be financial penalties for companies that break the rules, said Marshall Erwin, chief security officer at Mozilla, adding that it’s “fundamentally what moves the needle in a meaningful way.” However, requiring everyone to play by the same rules is also key, according to Kint, who added that companies with dominance should adhere to even higher standards.
“The digital ad market is a little bit like a water balloon,” Kint said. “If any individual actor — a retail site or publisher — moves forward with a higher level of standard, the advertising market will just shift to where they can find those users and target them.”