Critics say Germany’s hate speech law comes at a price

Germany’s controversial law requiring tech platforms to remove hateful content has turned up only a handful of offenders in the past six months. But some worry that is because the platforms have gone overboard in blocking questionable content to avoid fines.

The Network Enforcement Act, or NetzDG, was passed before Germany held its federal election last September and took effect in January. It requires platforms like Facebook, Twitter and YouTube to take down content deemed illegal within 24 hours, or within seven days for less clear-cut cases, or risk eye-watering fines of up to €50 million ($60 million).

Regulation of tech platforms can become a blunt tool for an issue that requires nuance. The law is vague on what constitutes hateful content, leaving platforms to make the final call. And with the risk of big fines, tech giants have reason to be more zealous with content takedowns.

“It’s like taking a sledgehammer to fix a wristwatch rather than a surgical instrument,” said Scott Vernick, a partner at law firm Fox Rothschild in the U.S.

Germany’s Federal Office of Justice said that by May it had received fewer than 400 complaints about content that platforms should have removed, far below its estimate of 25,000 complaints and 500 fine proceedings a year.

The law requires companies to publish transparency reports every six months, detailing how many removal requests they received and how the law has affected them. The first reports will be published July 31. Until then, the low complaint count has fueled speculation that the platforms have been overly zealous in blocking content, potentially impinging on free speech.

“We are seeing the effects of NetzDG, particularly as the larger platforms have started blocking content,” said Felix Hilgert, a senior associate at German law firm Osborne Clarke.

Members of far-right and far-left political parties alike oppose NetzDG. Germany’s left-leaning party Alliance 90/The Greens said the low number is evidence that users believe a federal authority shouldn’t be in the role of judging speech. How easy platforms make it to report hate speech also affects the number of complaints.

The law has led to other concerns. Politicians from the far-right Alternative for Germany party have had content temporarily blocked for inciting racial hatred. Germany’s satirical magazine Titanic had its Twitter account suspended for two days for parodying AfD’s tweets. Some worry the law will give more exposure to the hate speech it was designed to suppress, and that deleting questionable content will also destroy evidence that might be needed should a case come to court.

“The law is hard to implement,” Hilgert said. “I’m not aware of any case law yet that would help provide guidance to social networks on where to draw the line on certain issues. The practical difficulties were wholly foreseeable.”

The platforms have been staffing up to weed out hate speech. Google said it’s on track to meet its commitment to add 10,000 people globally to fight hate speech, including some NetzDG reviewers based in Hamburg. The platform has been using a combination of human reviewers and machine learning to flag violent and extremist content. Since December, it has been gradually training machines to automatically recognize hate speech, but the task is complicated by the fact that videos promoting terrorism tend to follow a formula, while identifying hate speech hinges on context, slang and intent.

Globally, Facebook is doubling the size of its safety, security and content review teams from 10,000 to more than 20,000 over the course of this year, with 1,500 people based in Germany reviewing content on behalf of the platform. We asked Twitter for comment; we’ll update this story when it responds.

Germany’s anti-hate speech law will draw more scrutiny after U.K. media watchdog Ofcom last week backed calls for greater governmental regulation of online platforms. Platforms have avoided the regulation applied to broadcasters and newspapers because they claim they are not strictly publishers. Ofcom has been cautious about regulation for fear of wading into the tangled free-speech debate. The regulator plans to set out fuller ideas about regulation in the fall, but as Germany’s example shows, lawmakers will need to balance freedom of expression against fighting content that is abhorrent to the ideals of a free and democratic society, Vernick said.

“One has to be realistic with the time frame for investigating posts,” he said. “The law shouldn’t extend beyond the nation state’s border.”

 
