The 2020 election has stirred up a new debate about Facebook’s role in political advertising.
This past weekend, Democratic presidential candidate Elizabeth Warren revealed that she had paid for a new ad on the platform containing deliberate misinformation: that Facebook CEO Mark Zuckerberg had endorsed President Trump in his bid for re-election.
Warren’s ad follows a decision from Facebook earlier this month: the company declined to remove an ad from the Trump campaign that featured an unproven claim that Democratic presidential candidate and former vice president Joe Biden had a hand in the removal of a Ukrainian prosecutor. While Biden requested that the ad be taken down, Facebook declined to do so and said the ad was in line with its policies. In September, the company said it would not be fact-checking speech or advertising from politicians.
Essentially, Warren is now using a Facebook ad to litigate Facebook’s ads policy in public. By running a test ad to question Facebook’s own policies, Warren has spurred a conversation on platforms’ responsibility in moderating and reviewing advertising content. Instead of simply pointing out Facebook’s policy change, Warren is using a paid ad to show people how political advertising now works on the platform and how Facebook’s decisions don’t just have an impact on the platform but can have an impact on political advertising at large. By allowing political ads with misleading or false speech in them, Facebook could potentially “move the needle and move the frame of acceptable speech toward messaging that is misleading or, in fact, false,” said Michael Horn, chief data officer at Huge.
At the same time, this new debate calls into question, once again, how Facebook should be defined. The debate over whether Facebook is a media company or a technology company has played out time and again over the course of the company’s history.
Facebook did not immediately respond to a request for comment.
Over the past 30 days, Warren has been mentioned nearly 4 million times on Twitter, according to Brandwatch, which found that part of that conversation has been around Warren’s Facebook stance. There have been 217,000 mentions of Facebook within Warren’s discussion, according to the company, which found that 48,000 of those accumulated on Oct. 12, which is when Warren posted about her new ad.
While political ads are generally understood to function in a different realm than brands’ advertising, consumers may no longer buy the argument that platforms are not responsible for the content posted on them, especially when it comes to advertising.
“We sort of assume that the people who control social media have some accountability into the credibility and trustworthiness of the content,” said Barry Lowenthal, president of The Media Kitchen. “But we realize now that that’s not true. [The platforms are] not taking that accountability, so there’s this crisis of trust.”
Of course, trust in advertising has been questioned repeatedly in recent years, especially after the Cambridge Analytica scandal following the 2016 election, which made clear how consumer data could be misused in elections. While that shed light not only on how much data platforms have on consumers but also, more generally, on how that data could be used for ad targeting — a concept that consumers have come to understand more over the past few years — consumer behavior didn’t necessarily change. That will likely continue to be the case this time, according to media buyers, who believe that while this recent kerfuffle between Warren and Facebook will drive conversation, it won’t change consumer behavior.
“There is increased awareness [around ad targeting and data], but that awareness isn’t the same as motivation to do anything about it,” said Horn. “That has not translated to action in a way that you might expect a consumer responding to what’s seen as a more overt abuse of trust. Just knowing that you’re being targeted based on a set of behaviors or demographics is not the same as saying, ‘I’m going to leave Facebook or not use this device.’”
That lack of change in consumer behavior, even as those consumers understand how their data is used to serve them specific ads, is likely due to the difficulty of opting out. As more and more of everyday conversation happens on digital platforms, it’s harder for consumers to decide not to use those platforms. At the same time, reining in the amount of information a platform has by changing privacy preferences is time-consuming, and there isn’t yet a standard for doing so. Regulations like GDPR and the upcoming implementation of CCPA are meant to establish that standard around data privacy, but until one is set, consumers remain generally aware of — and fine with — their data being used for ad targeting, according to media buyers.
“If your only option is to opt-out entirely, that’s something very few consumers are willing to do,” said Horn.
The new conversation “highlights an important issue, but if history is any indication of how this will play out, not much will come of the back and forth,” said Carrie Dino, media director at Mekanism. “I would be surprised if any sweeping changes resulted from this particular exchange, but agencies have to be hyper-vigilant in staying on top of targeting laws, best practices and changes to [Facebook] targeting capabilities.”
Consumers’ knowledge of ad targeting — at least, on a cursory level — hasn’t yet changed their behavior in a substantial way on platforms like Facebook. But with this upcoming election, consumers may keep a closer eye on how platforms use that data and how candidates use it. Still, even as consumers believe they understand ad targeting, they may not understand the nuance of it.
“[This] is about understanding the role that the algorithm plays on a biddable ads platform,” said Noah Mallin, head of experience at Wavemaker. “The Warren campaign’s point was that misleading and sensational claims amplify all the user responses that the algorithm sees as ‘good,’ allowing the ad to reach more people at the same budget. I’m not sure if people understand that nuance, but it is good to have a civic conversation about the role of large information platforms in public life, and in that way this feels like more of a continuation of a conversation that has been happening since at least the 2016 election.”
Still, the recent back and forth between Warren and Facebook is not just a rehash of the 2016 election issues on Facebook. Given its focus on the content of the ads and on truth, it may push forward the conversation about that content and about platforms’ responsibility to moderate it, especially as consumers start to compare the truthfulness of ads on digital platforms to those run on television.
“For cause-based and political advertising, it will hopefully force the platforms to adhere to something closer to the FCC’s Truth in Advertising rules and regulations,” said Rory O’Flaherty, head of media at Mekanism.
This question of the platforms’ role in content moderation has not changed clients’ Facebook budgets, buyers say.
“The issue Facebook is having is that they’re peddling in untrue content, so therefore no one is going to believe that any of this is real,” said Lowenthal. “Clients look at all different social channels in totality. A lot of clients are optimizing for performance whether conversion or engagement. Our clients are not peddling in untrustworthy content. It’s very, very different.”