It may have been Facebook’s role in the 2016 U.S. presidential election and misuse of its users’ information that led to Mark Zuckerberg’s appearance before Congress this week, but the two days of hearings have gone far beyond that and turned into something of a referendum on the ills of Facebook and the tech industry in general.

The tech giant’s questionable ability to fully understand and control the breadth of its user data was on full display at the hearings. On day two of questioning by Congress, the Facebook CEO fielded questions on everything from a lack of racial diversity in Facebook’s ranks to its impact on what news people see and its broader role in society.

Rep. G.K. Butterfield, D-N.C., held up photos of Facebook’s all-white leadership team, saying, “This does not reflect America.” He also asked if the company would hire more black employees and for retention data on employee diversity, pointing to an issue that’s plagued Silicon Valley generally.

Zuckerberg said Facebook was “focused on” the issue. “I think we know that the industry is behind on this,” he responded.

Zuckerberg also faced questions about Facebook’s role in society. Rep. Anna Eshoo, a Democrat from Facebook’s home district in California, dealt him a line of tough questioning, asking: “Do you have a moral responsibility to run a platform that protects democracy?” (Response: “Yes.”)

At another point, Zuckerberg was asked about experiments Facebook did to determine how using the social media platform affects people’s mood. “We felt we had a responsibility to understand that,” he said. “We want our services to be good for people’s well-being.” Based on the finding that people are happier when they use the platform actively, he said, Facebook has shifted its product to promote user engagement, to the dismay of publishers that now find their content increasingly deprioritized in the news feed.

Zuckerberg fielded repeated questions about whether Facebook has a bias against conservative content, a charge he denied.

From these and other questions, a theme emerged: Facebook has gotten so big it can’t police itself, not just when it comes to keeping user data safe, but around all the activity on its platform. To listen to the hearings, it sounded at points like Facebook had become a quasi-state; at one point, a member of Congress asked Zuckerberg how Facebook could help protect people from terrorism.

Zuckerberg said again and again that people have control over their data on Facebook. But Rep. Kathy Castor, a Democrat from Florida, said Facebook’s settings aren’t enough to protect users and that the law hasn’t kept pace, calling for Congress to act. “That’s the business you’re in, gathering and aggregating data,” she said.

At another point, Rep. David McKinley, a Republican from West Virginia, pummeled Zuckerberg with questions about the appearance of ads for opioids on Facebook. “Your platform is still being used to circumvent the law,” he said. “You are hurting people.”

It’s unclear how well Zuckerberg did at selling the virtues of the tech industry. He was poised and polite but stuck to his talking points, repeatedly falling back on the line that Facebook depends on users to flag objectionable content on the platform. He said Facebook’s own staff isn’t big enough to monitor the billions of pieces of content that run through it, and that the company needs to develop artificial intelligence tools to attack the problem. In the case of the opioid ads, which touch on a very human problem, the answer was a very mechanical one.
