Cheat sheet: U.S. lawmakers propose Section 230 reforms to regulate online paid speech
One of the internet’s primary protections is on the verge of an overhaul.
Section 230 — part of the Communications Decency Act — shields online companies, including tech platforms and publishers, from liability for the content that others post to their sites. But amid outcries over disinformation and hate speech, as well as concerns about biased decisions to kick people off social media platforms, Democratic and Republican lawmakers alike have proposed changing Section 230 over the past year.
The most recent reform proposal is the SAFE TECH Act, which was introduced on Feb. 5 by Democratic Sens. Mark Warner of Virginia, Mazie Hirono of Hawaii and Amy Klobuchar of Minnesota. The bill would make platforms like Facebook and Twitter liable when paid content posted on their sites is used to target vulnerable people. But some lawyers and media executives say the SAFE TECH Act and other reform proposals would remove legal protections that helped the web thrive in the first place.
Here’s a breakdown of what’s happening with Section 230 and why it matters:
01. Lawmakers have homed in on reforming Section 230 as a catch-all remedy for a variety of internet ills that fall under the broad umbrella of big tech problems, from disinformation and cyber-bullying to fraudulent product marketing.
02. In general, Section 230 reform proponents want more accountability for social media platforms that they say have relied on the law as a get-out-of-jail-free card for too long.
03. Lawyers and media execs worry that reforming the law could have unintended consequences, put undue legal pressure on small publishers and lead to content moderation changes that remove the instantaneous quality of the social media experience.
04. President Joe Biden has signaled support for reforming Section 230, and multiple proposals for reform have been introduced in Congress.
The case to reform Section 230
The legal protections afforded by Section 230 have helped to foster the double-edged sword of online content distribution. By enabling user-generated content, the law has allowed for the rise of the digital creator economy and the use of social media to draw attention to everyday injustices. But it has also enabled online harassment and permitted the dissemination of disinformation and fraudulent ad messages. As a result, the law has become a focal point for lawmakers and reform advocates who want to curb what they consider to be its negative effects. “Section 230 has become this lightning rod,” said Cathy Gellis, an independent San Francisco-based internet lawyer.
Republicans want to alter Section 230 to stop platforms from discriminating against conservative voices by removing accounts or censoring posts deemed to be hate speech, disinformation or other content that incites violence. Meanwhile, Democrats tend to be intent on changing Section 230 to suppress extremist speech, ad messages or disinformation that could lead to harassment or harm against vulnerable people.
“I strongly believe these platforms need to be regulated not for content, but they need to be regulated for the harms that emerge from them,” said Ellen Goodman, co-director and co-founder of the Rutgers Institute for Information Policy and Law.
Chris Pedigo, svp of government affairs at publisher trade group Digital Content Next, suggested platforms have taken advantage of Section 230. “Premium publishers go to great lengths to ensure they publish factual, quality content,” he said. “Platforms should be empowered and encouraged to do the same. Unfortunately, Section 230 has been often misused by big tech to do the absolute bare minimum and then hide behind a cloak of liability protection.”
Section 230 reforms in general would help to weed out nefarious content, which would be a good thing for advertisers, said Brian Wieser, global president of business intelligence at GroupM. “Anything a publisher can do to detoxify the platform makes more brands want to be there,” he said.
Tech leaders appear to support reforming Section 230. In October, Facebook CEO Mark Zuckerberg said, “Congress should update the law to make sure it’s working as intended.” In November, Twitter CEO Jack Dorsey similarly signaled support for Congress altering or adding to Section 230.
The latest attempt to change Section 230
The latest attempt to change the law is the SAFE TECH Act (short for Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms Act). The bill is supported by the NAACP and the Anti-Defamation League, among others, which say it would hold platforms accountable for hosting hate speech, enabling civil rights abuses or running voter suppression ads. Klobuchar has also made a splash with a recent antitrust reform bill that she has sold as a way to solve problems created by big tech.
The SAFE TECH Act would hold websites, social media platforms and any other “interactive computer service” liable for paid speech, allowing people to sue companies that host offending speech when payment is involved. Though the bill was originally intended to cover digital advertisements and marketplace product posts — such as posts promoting items like illegal guns — rather than organic posts or comments, its language has been criticized because it could have implications for organic third-party posts on subscription services.
“This language was developed to reach digital ads and marketplace items. Given the potential implications that this would have on subscription services as observers have noted, Sen. Warner is currently reviewing and working to refine the language,” a spokeswoman for Warner told Digiday.
Unintended consequences
Lawyers interviewed for this story, including Gellis, consider the bill’s language too broad. In particular, they take issue with the phrase that would remove liability protections for a publisher that “has accepted payment to make the speech available.” They say the bill’s breadth could have implications not just for social media platforms disseminating harmful speech through advertising but also for any web operation that accepts payment to run its business. That could span everything from Substack newsletter journalists to e-commerce sites like Etsy.
“I don’t think it’s overstating it to say [the SAFE TECH Act] destroys the internet economy,” Gellis said. She and other detractors also argue that the bill’s affirmative defense provision would place an extra burden on defendants because it could require them to produce documentation in their defense, which could lead to costly litigation.
Changes to the social media experience
Some advertising executives and lawyers worry that removing Section 230 protections could influence social media platforms to establish new content moderation obstacles that could change the user experience for the worse.
If the SAFE TECH Act were to pass, “I have every reason to believe that your Average Joe would likely not have access to the public square the way that they’ve seen with the platforms before,” said Kieley Taylor, global head of partnerships at GroupM.
If the platforms were subject to increased liability for user-generated content, Taylor said, they may take more steps to moderate content before it is posted, delaying the instant gratification people have come to expect from posting to social platforms.
The problem for publishers
Section 230 reform could hurt small publishers that may not have the resources to counteract potential lawsuits or manage new content moderation requirements, according to Eric Goldman, a professor at Santa Clara University School of Law who focuses on tech and internet law. He suggested, for instance, that removing Section 230 protections for self-serve ad content could force small publishers to decide between establishing costly new ad moderation procedures or finding other ways to support their content.
“They should be panicked. So many digital publishers may have a self-service ad model that will not be tenable in a post-230 world,” he said. It’s not clear how reform proposals might affect sites with ads served through third-party ad exchanges.
Possible support from the Biden administration
The Biden administration might welcome Section 230 reform. Biden told The New York Times last year that he wants the law revoked. His newly confirmed Commerce Secretary Gina Raimondo said at a Jan. 26 hearing that Section 230 needs reform. “We need platform accountability, but of course that reform would have to be balanced with the fact that these businesses rely upon user-generated content for their innovation and created many thousands of jobs,” Raimondo said.
Expect more Section 230 proposals
Several other Section 230 reform proposals that would remove various content liability protections for digital platforms were introduced last year and could find their way into the current Congress. “There will be dozens of proposals in [the] next few months,” Goldman said.
Here are a few introduced last year that have garnered some attention and could offer a preview of what to expect from any future proposals:
- A bipartisan bill known as the PACT Act would require large online platforms to remove court-determined illegal content within 24 hours and create a notification and complaint process for disputing content moderation decisions.
- A Democratic bill called the Protecting Americans from Dangerous Algorithms Act would hold large social media platforms with 50 million or more users liable for using algorithmic systems that amplify “radicalizing” content that leads to offline harms.
- A Republican bill called the Limiting Section 230 Immunity to Good Samaritans Act is aimed at holding big tech firms accountable for what is construed as discriminatory decisions to censor and kick people off their platforms. It would stop large tech firms from discrimination when enforcing terms of service and allow users to sue them for breaching their “contractual duty of good faith.”
With Democrats in control of Congress and the White House, any Section 230 reforms that do pass this year are likely to focus primarily on the areas Democrats have emphasized, such as addressing extremist content and messaging that leads to harmful behavior.