Princeton researchers ditch Facebook political ad project after the platform used a debunked FTC privacy defense
When policymakers want to understand how political ad targeting affects elections, they look to academic researchers, and those researchers look to one of the most important platforms that sells those ads: Facebook.
Now, Princeton University researchers who applied months ago to access Facebook-sanctioned political ad data have pulled the plug on that would-be project, blaming the platform’s rigid contractual requirements and its now-debunked claim that the company’s settlement with the Federal Trade Commission prohibited Facebook from negotiating its terms of data access. The researchers’ decision comes on the heels of Facebook’s shutdown of another high-profile political ad research project at New York University.
For Princeton researchers including Orestis Papakyriakopoulos, a Ph.D. researcher at the university’s Center for Information Technology Policy, the key sticking point was a contract Facebook requires research institutions to sign before accessing its data. In particular, he and others on his digital tech policy research team were concerned that agreeing to the contract would give Facebook the right to remove information from their research findings had they gone through with the project.
“It doesn’t make sense for us to do research for six months and then not be able to publish it,” Papakyriakopoulos told Digiday.
The Princeton researchers and the school’s lawyers were concerned that, if the research findings revealed how Facebook’s ad targeting technology and tools worked or how the company’s system determined ad prices, the contract would give the company the right to remove those findings from research before publication. “We sought to clarify whether Facebook would assert that information about how the Facebook advertising platform was used to target political ads in the 2020 elections is ‘Confidential Information’ that the agreement would allow them to ‘remove’ from our publication,” the research team wrote in an August 5 post published on the center’s blog.
The contract Facebook requires researchers to sign to access data through its Facebook Open Research and Transparency platform, or FORT, states that research findings resulting from analysis “may not disclose any Confidential Information or any Personal Data” and gives Facebook the opportunity to review publication drafts “to identify any Confidential Information or any Personal Data that may be included or revealed in those materials and which need to be removed prior to publication or disclosure.” According to the contract, Confidential Information includes information relating to Facebook’s products and technology, its data processing systems, policies and platforms, in addition to personal information pertaining to its users or business partners.
“The questions these researchers ask and conclusions they draw are not restricted by Facebook,” a Facebook spokesperson told Digiday regarding the Princeton researchers. “We simply ask academics to sign a research data agreement to ensure no personal data or confidential information is shared. Today, hundreds of researchers at more than 100 universities have signed the agreement.”
The company said it does not approve or reject research papers. “As of now, we have not rejected any research papers as a part of our standard review process to ensure no personal data or confidential information is included,” said the Facebook spokesperson.
Facebook’s FORT data platform is an example of the company’s increasingly restrictive approach to engaging academic researchers in an environment that has changed drastically since the 2016 Cambridge Analytica scandal, in which Facebook user data gathered under the guise of academic research was used for psychographic political ad targeting. Critics often refer to tech companies’ justifications for limiting researchers’ access to data as “privacy-washing.”
Poking holes in Facebook’s FTC defense
It turns out that Facebook’s justification for why it would not negotiate the contract with the Princeton researchers employed an argument that has now been debunked by the FTC. Facebook told the researchers its contract was non-negotiable because the stipulations therein were mandated by Facebook’s 2019 settlement with the agency involving consumer privacy violations. “We pushed back on this ‘take-it-or-leave-it’ approach,” wrote the researchers, who added, “Facebook later conceded in a subsequent email that they were under no legal mandate and that their approach was simply based on their internal business justification.”
Facebook’s reliance on its FTC agreement to justify restricting academic researchers’ data access came to the fore on Aug. 3, when the company’s product management director Mike Clark wrote that the FTC Order was justification for Facebook’s decision to disable accounts and apps associated with NYU’s Ad Observatory Project, a political ad targeting research effort that had already been under threat of shutdown by Facebook since October 2020.
“We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order,” he wrote, noting that the NYU project’s “ongoing and continued violations of protections against scraping cannot be ignored and should be remediated.” Clark said the NYU researchers should have tapped Facebook’s sanctioned FORT data instead.
In response to the Facebook post, the acting director of the FTC’s Consumer Protection Bureau, Samuel Levine, wrote, in a letter to Facebook CEO Mark Zuckerberg published on the agency’s site, that its agreement with Facebook “does not bar Facebook from creating exceptions for good-faith research in the public interest.” Notably, he added, “the FTC supports efforts to shed light on opaque business practices, especially around surveillance-based advertising.”
But even back in May, Laura Edelson, an NYU Ph.D. candidate working on the now-shuttered project, told Digiday the FORT data wasn’t of interest because of restrictions on the level of ad targeting information Facebook made available in the data set. Facebook said those limits were “one of several steps we have taken to protect users’ privacy.”