European regulators are turning up the heat on U.S. technology platforms this year. Next on the list: cracking down on hate speech.
Germany has led the way. In January 2018, it became the first country to introduce a law requiring platforms to take down suspected illegal content within 24 hours or seven days, depending on the offense, or risk fines of up to €50 million ($56.5 million). The law, the Network Enforcement Act, or NetzDG, also requires the platforms to report their findings every six months.
As with data infringements, antitrust violations and copyright breaches, 2019 has so far proven to be a year in which European regulators have become more aggressive about levying fines on the platforms, this time over hate speech. It’s been 18 months since Germany introduced NetzDG, and other countries are now following suit.
Here’s what to know about the progress of anti-hate speech laws.
Facebook’s €2 million ($2.3 million) fine
In Germany last week, the Bundesamt für Justiz, Germany’s Federal Office of Justice, announced plans to fine Facebook €2 million for allegedly failing to meet the complaint-reporting obligations set out in the NetzDG law.
Last summer, after Facebook, Twitter and Google released their first reports, Facebook logged only 1,704 complaints, significantly lower than Twitter’s 264,000 and YouTube’s 215,000. The reason, according to the BfJ: Facebook has made its NetzDG reporting form too hard to find. Facebook has always offered a clear way to report posts that breach its community guidelines, but without data on how many complaints were made through that channel, the BfJ claimed, Facebook distorts the picture of how much illegal content exists and how it is dealt with.
In an emailed statement, a Facebook spokesperson said the platform had no objections from the BfJ during “constructive discussion” before the law was implemented. “We are confident our published NetzDG reports are in accordance with the law but as many critics have pointed out there are a number of areas where this law lacks clarity. We will analyze the fine notice carefully and reserve the right to appeal.”
The BfJ also cited Facebook on three other counts, including the training of the people who handle the complaints (Facebook said it employs 63 people to review NetzDG complaints) and the way it then handles those complaints.
The number of NetzDG complaints is decreasing
This month, the platforms will release their third report under Germany’s NetzDG. In January 2019, Facebook, Google and Twitter detailed their progress for the previous six months.
Between July and December 2018, Facebook received 1,048 complaints, compared to 1,704 in the previous six months. Of these, 35% were deleted or blocked, compared to 21% in the previous six months.
For the same time period, YouTube received 168,000 complaints, compared to 215,000 in the previous six months. Over 95% were blocked within 24 hours.
Other countries follow with hate speech laws
This week, France followed Germany’s lead, saddling U.S. tech platforms with a time limit for removing “hateful” content or facing fines of up to €1.25 million ($1.41 million).
The size of these fines isn’t likely to make the platforms quake. For instance, they’re not as weighty as the €1.5 billion ($1.7 billion) antitrust fine Google faced in March. But they are symbolic. “[The fines] are still very significant,” said Oliver Fairhurst, senior associate at law firm Lewis Silkin. The platforms have also shown a willingness to comply with the laws, which reflects the fact that they don’t want hate speech on their platforms either, he added.
The U.K. government is also keen to regulate online content and is inching closer: In April, it released a proposal to tackle “online harms.” The U.S., too, is keeping a close eye on how Germany’s approach to the tech platforms plays out before deciding how to apply its own regulatory measures on hate speech.
The laws remain imperfect
As the French and German lawmakers have found, pinning down “hateful” and “illegal” content without making platforms the arbiters of free speech is a tricky route to navigate. This week, French politicians debated the definitions into the night and agreed to include condoning crimes against humanity, but specific references to anti-Zionism and hatred of the state of Israel were rejected from the final text.
Another sticking point is the time frame: Content spreads quickly on social platforms, yet in France, Facebook argued that 24 hours is too short to remove content that requires more detailed analysis.
Ultimately, forcing platforms to regulate content involves them adjudicating on often complex legal issues. Applying concepts like “defamation” to particular statements requires detailed legal knowledge and experience as well as access to background information. Platforms don’t always have the legal know-how to work out these issues.
“The question is whether platforms such as Facebook should be in that position, or whether the courts should have sole responsibility for removing anything other than clearly unlawful content,” said Fairhurst.