The Federal Trade Commission is punching right at the heart — and guts — of how data collection drives revenue for tech firms: their algorithms.
“I anticipate pushing for remedies that really get at the heart of the problem and the incentives that companies face that lead them into the illegal conduct,” FTC commissioner Rebecca Slaughter told Digiday in an interview last week.
Slaughter pointed to two cases that reflect what we might see more of from the agency. When the FTC in May settled its case against Everalbum, maker of Ever, a now-defunct mobile photo app that allegedly used facial recognition without people's consent, the agreement featured a new type of requirement that reflects how today's technologies are built, how they work and how they make money. The FTC required the firm to obtain express consent from people before applying facial recognition to their photos and videos, and to delete the photos and videos of people who had deactivated their accounts. But the agency also imposed what it called a "novel remedy": Everalbum had to delete the models and algorithms it developed using the photos and videos uploaded by people who used its app.
Put simply, machine-learning algorithms are developed and refined by feeding them large amounts of data from which they learn and improve; the resulting models are a product of that data, their behavior a legacy of the information they consumed. To make a clean sweep of the data a company collected illicitly, then, the company must also wipe out the algorithms that ingested that data.
Cambridge Analytica case laid groundwork for algorithmic destruction
The Everalbum case wasn’t the first time the FTC had demanded a company delete its algorithms. In fact, in its final 2019 order against Cambridge Analytica, alleging that the now-infamous political data firm had misrepresented how it would use information it gathered through a Facebook app, the company was required to delete or destroy the data itself as well as “any information or work product, including any algorithms or equations, that originated, in whole or in part, from this Covered Information.”
Requiring Cambridge Analytica to delete its algorithms “was an important part of the outcome for me in that case, and I think it will continue to be important as we look at why are companies collecting data that they shouldn’t be collecting, how can we address those incentives, not just the surface-level practice that’s problematic,” Slaughter told Digiday.
The approach is a sign of what companies in the crosshairs of a potentially more-aggressive FTC could have in store. Slaughter said the requirement for Cambridge Analytica to kill its algorithms “lays the groundwork for similarly employing creative solutions or appropriate solutions rather than cookie-cutter solutions to questions in novel digital markets.”
Correcting the Facebook and Google course
It’s not just Slaughter who sees algorithm destruction as an important penalty for alleged data abuse. In a statement published in January on the Everalbum case, FTC commissioner Rohit Chopra called the demand that Everalbum delete its facial recognition algorithm and other tech “an important course correction.” The agency’s previous settlements with Facebook and Google-owned YouTube did not require those firms to destroy algorithms built from illegally attained data; the remedy in the Everalbum case, by contrast, forced the firm to “forfeit the fruits of its deception,” wrote Chopra, to whom the FTC’s new reform-minded chair, Lina Khan, formerly served as legal advisor.
Slaughter’s stance on forcing companies to kill their algorithms, also addressed in February in public remarks, has caught the attention of lawyers working for tech clients. “Slaughter’s remarks may portend an active FTC that takes an aggressive stance related to technologies using AI and machine learning,” wrote Kate Berry, a member of law firm Davis Wright Tremaine’s technology, communications, privacy, and security group. “We expect the FTC will consider issuing civil investigative demands on these issues in the coming months and years.”
Lawyers from Orrick, Herrington &amp; Sutcliffe backed up Berry’s analysis. In the law firm’s own assessment of Slaughter’s remarks, they said that companies developing artificial intelligence or machine-learning technologies should consider providing people with proper notice regarding how their data is processed. “Algorithmic disgorgement is here to stay in the FTC’s arsenal of enforcement mechanisms,” the lawyers said.