Adam Schrader could have seen this coming.
The 26-year-old former Facebook employee was anything but shocked when the social network entered meltdown mode over being infested with fake news during the presidential election. Schrader had been under the hood. A former member of its now-defunct trending-news team, he monitored a feed of stories gaining traction on Facebook, vetted them for accuracy (or at least truth) and wrote a headline for Facebook’s public trending-news feed.
But a funny thing happened in August: Facebook fired its human trending-news curators and replaced them with an algorithm. Almost instantly, the social network was awash in false news stories that many users were treating as credible and sharing on their timelines. The 2016 election, polarizing as it was, fed the fake-news beast.
“Facebook has a fake-news problem, and I don’t believe that they recognize it,” said Schrader. “I think they’re in denial of the fact, but it’s a pervasive problem and they need to address it.”
With the dust still settling around the election — and many voters feeling burned by misinformation — the social network has responded by cutting publishers of false news reports off from its display ad network. And yet, at the same time, CEO Mark Zuckerberg has expressed reluctance to have Facebook play a more active role in weeding out fake news stories, saying that Facebook must be “extremely cautious about becoming arbiters of truth ourselves.”
For our latest Digiday Confessions, we talked with Schrader, who worked at Facebook for four months until August 2016, and another member of the defunct trending-news team who worked on the team in 2015 and chose to remain anonymous.
Over the course of the year, false stories — such as Megyn Kelly being fired from Fox News and the pope endorsing Donald Trump — made the rounds on Facebook, where users generally assumed them to be accurate.
Schrader said that the fact that Facebook’s trending section put out fake stories from disreputable publications on multiple occasions was “concerning.”
“It’s incredible how much fake shit was pushed out this year and propagated on Facebook without any vetting,” the second source said. “It’s extremely disappointing; it’s like this petri dish of bullshit.”
Both recalled routine instances of fake stories surfacing in the trending tool for them to review and write headlines and summaries for. Once a story was fact-checked and found to be false, it was their job to suppress it. One example was a trending video of an eagle snatching up a child and flying away with it, which later turned out to be fake.
“It happened almost daily,” said Schrader.
‘Journalists are a great safeguard’
Both said that while it is understandable that Facebook may want to be careful about not appearing partisan, its sheer size and influence necessitate that it take the problem seriously.
“There’s this Silicon Valley ‘free market’ mindset, where they don’t want to be nannies to their users,” said the anonymous source. “But they have 1.8 billion users, and a lot of those people use their site to get their news — and it can be extremely harmful to the way some people think if it is full of such content.”
Both felt that the journalists who made up part of the company’s former trending-news team served a very important function and that the problem has gotten worse since the team was disbanded.
“The trending-news team was definitely the barrier that kept the fake news from trending, and that’s because we had the ability to fact-check in a way that an algorithm can’t and the news judgment to know what is real and what isn’t,” said Schrader.
“I think journalists were a great safeguard against fake news,” said the second source. “From a tech standpoint, it was probably pretty inefficient since it takes time to summarize and fact-check, but at least they were monitoring stories and nixing inaccurate ones.”
‘A media company in denial’
According to a survey by the Pew Research Center and the John S. and James L. Knight Foundation, 44 percent of U.S. adults said that they got news from Facebook in 2016. That’s a greater share than news-focused social media sites like Twitter (9 percent) or Reddit (2 percent), and greater than that of many traditional media organizations.
“Facebook is totally a media company, and they’ve totally been in denial this whole time,” said Schrader. “Just because you don’t identify as a media company doesn’t mean that you aren’t one.”
Further, many of its products are geared toward journalists; just take Facebook Live, for instance. “It is a news tool; they are paying journalists to create content,” he said. “That is a news publisher.”
“The fact that they have the trending section and allow publishers to put out posts puts them squarely in that region of a news company,” said the second source. “If they didn’t have that, then, yes, they can’t censor their users for sharing false information. But when you start featuring content on the trending tab, that’s when you need to take some responsibility.”
For Schrader, the correct response by Facebook would be, first and foremost, to acknowledge that there is a fake-news problem. Then it could work with news publishers to find a way to combat it — perhaps by coming up with some sort of review tool for news before it’s published, or by letting consumers report articles as factually incorrect.
“They need to pull senior editors from major publications to create a newsroom, and they need to have them overseeing the different products: news feed, trending news, events, any product that has an editorial function,” he said.