‘Before, it was a black box’: Platforms report how they delete illegal content in Germany
It’s been six months since Germany started enforcing a new law holding social networks accountable for the content they host, and last week, Facebook, YouTube and Twitter reported on the content they’ve deleted or blocked in that country so far. With similar regulation being considered in the U.K., politicians there are closely watching how German lawmakers are working with U.S. tech platforms.
The Network Enforcement Act, or NetzDG, requires platforms to take down suspected illegal content within 24 hours or seven days, depending on the charge, or risk eye-watering €50 million ($60 million) fines. Every six months the platforms are required to report their findings. Here are the headline numbers:
- Twitter received 264,000 complaints and blocked or removed 11 percent of the reported content.
- YouTube received 215,000 complaints and removed or blocked 27 percent of the reported content, most of which was hate speech or political extremism.
- Facebook received 1,704 complaints and removed or blocked 21 percent.
- YouTube removed nearly 93 percent of the content that needed to be blocked within 24 hours of receiving the complaint.
- In 637 cases, Twitter took longer than 24 hours to remove content.
- Most of Facebook’s content deletions took place within 24 hours; in 24 cases, removal took more than a week.
- YouTube asked for help from an outside law firm specializing in criminal law 40 times.
“The law shows the players are taking obligations seriously,” said Ruben Hofmann, a partner at law firm Heuking Kühn Lüer Wojtek. “Before the publication of the reports, it was a black box. The free word is well protected by Germany’s constitution and civil code. If we ask platforms to delete content, then the boundaries must be quite high, that it’s reached a criminal act.”
Critics of the law feared that platforms would go overboard in blocking content to avoid fines, threatening free speech. The law is also imprecise about which content counts as illegal, giving platforms a lot of leeway.
With this being the first set of reports and each platform having different reporting tools, it’s hard to know if those fears are warranted.
For example, Facebook had a comparably low number of complaints, which critics suggested is because its NetzDG reporting feature is separate from its existing system to report violations of its community standards, while YouTube and Twitter baked those reporting features into their existing system for reporting violations.
The fact that the platforms blocked a small percentage of the content that was brought to their attention suggests they’re not overreacting, but it’s hard to tell without looking at each case on its own merits, said Hofmann.
The next set of reports, due in six months, will show whether there is any change in how aggressively the platforms take down content. Germany’s Federal Office of Justice has said these reports will be used to reevaluate the law in 2020.
A neutral body set up and financed by the tech players but with governmental oversight would improve the current law, Hofmann said. “We don’t want private institutions to decide if content is criminal.”