In defense of Facebook algorithms and programmable news

by Pat McCarthy, SVP Marketing, AppNexus

A lot of fingers have been pointed at Facebook recently, and not all of them were aimed at the “Like” button.

Allegations have surfaced that Facebook’s in-house team of “news curators” routinely suppress “trending” political stories that express conservative viewpoints. Though stories on its trending topics module are selected algorithmically, the company has acknowledged that live human beings manage the process, an admission that has raised concerns that the social media giant uses its outsized voice to impose a liberal bias. Facebook sharply denies the claim, and it recently took steps to assure conservative media figures of its impartiality.

I suspect that the controversy over Facebook’s trending news stories is less about one company’s alleged bias, and more about popular confusion around the customization and personalization of the internet. Ten years ago, most people still received political news and commentary – indeed, all digital content – by visiting static websites or opening static apps.

Today, powerful algorithms draw data from the programmable internet of interconnected devices and applications and determine what pops up on our screens. On the surface, that means we are surrendering a great deal of agency to the people who design and manage those algorithms. It’s a relatively new state of affairs, and it can seem scary.

Indeed, there are two knocks against what we could call “programmable news” – that is, news curated by algorithms and the people who manage them. First, we may be forfeiting human discovery and independent thought by reading only what the machine allows us to see. Second, by turning our political and news content over to algorithms that predict what we want to read, we may be siloing ourselves based on ideology and identity. These silos, the argument goes, limit meaningful debate.

But historical context is important. For starters, it doesn’t take a programmable algorithm to limit human discovery. Whether you watch partisan cable networks like Fox and MSNBC or read less biased sources like the Economist or the Associated Press, you are empowering a third-party editorial staff to determine what news and commentary you see. We don’t often see the U.S. Senate threatening to drag network producers before its committees to answer questions about their alleged bias. Nor should it. There is such a thing as freedom of the press.

Furthermore, it doesn’t take an algorithm to create silos. From the earliest days of the American republic, citizens read the outlets that most aligned with their ideology and ignored those that didn’t. Supporters of George Washington’s administration read the Gazette of the United States. Opponents read the National Gazette.

By the mid-nineteenth century, 95 percent of American newspapers were affiliated with one of the major political parties. The programmable internet didn’t create these silos. If anything, it’s the one instrument that might break them down.

Here is the point: algorithms aren’t supposed to be “unbiased.” They aren’t supposed to replace human intuition, judgment, and knowledge. They are an instrument of machine learning: they exist to take human intuition and apply it at scale – enormous scale – the kind of scale that allows the programmable internet to customize the experience of a billion internet users at any time. A human being – or a team of several dozen human beings (like Facebook’s news “curators”) – can’t produce this unique experience at scale, but they can build, adjust, and manage algorithms that can.

To draw on another example, in my industry – digital advertising – algorithms aren’t going to replace creativity, research, and marketing expertise. They increasingly allow us to scale those human qualities to deliver more relevant, diverse, and customizable messages to consumers.

Is there room for partisan influence in “programmable news”? Of course there is. Is this bias more powerful than the influence we already allow editors and producers to exercise? Probably not.

If anything, algorithms may help break down silos and broaden our exposure. Which is to say, machine learning may prompt the reader who normally tosses out the sports page to try a recommended article on professional basketball, or the sports fanatic who never opens the Sunday book review to read an excerpt of Franklin Foer’s book on soccer and its political meaning.

That’s not to say that Facebook shouldn’t be careful. It’s not just a distribution channel. It’s a media company – one with 1.65 billion monthly active users. With that reach comes responsibility.

“Programmable news” is here to stay, every bit as much as the programmable internet is now a fixture of everyday life. The question isn’t whether human beings are losing their agency, but – as always – how they choose to exert it.

