A German law requiring social media companies like Facebook and Twitter to quickly remove hate speech from their sites is set to be revised following criticism that too much online content is being blocked, according to officials.
The law, which came into full force on Jan. 1, is the most ambitious effort by a Western democracy to control what appears on social media. It aims to ensure Germany’s tough prohibitions against hate speech, including pro-Nazi ideology, are enforced online by requiring sites to remove banned content within 24 hours or face fines of up to 50 million euros ($62 million).
The law, called NetzDG for short, is an international test case, and how it plays out is being closely watched by other countries considering similar measures.
German politicians forming a new government told Reuters they want to add an amendment to help web users get incorrectly deleted material restored online.
The lawmakers are also pushing for social media firms to set up an independent body to review and respond to reports of offensive content from the public, rather than the individual companies doing that themselves.
Such a system, similar to how video games are policed in Germany, could allow a more considered approach to complex decisions about whether to block content, legal experts say.
The proposed changes follow widespread criticism from opponents of the law, including free speech campaigners and the Association of German Journalists, who say the threat of hefty fines is prompting internet firms to err on the side of caution and block more content than is necessary.
They point to several high-profile cases, including when a satirical magazine’s Twitter account was blocked after it parodied anti-Muslim comments.
Facebook, which says 1,200 of the 14,000 people it employs globally on content moderation and account security are based in Germany reviewing posts, said it was “not pursuing a strategy to delete more than necessary”.
“People think deleting illegal content is easy but it’s not,” said Richard Allan, Facebook’s vice president for EMEA public policy. “Facebook reviews every NetzDG report carefully and with legal expertise, where appropriate. When our legal experts advise us, we follow their assessment so we can meet our obligations under the law.”
Twitter declined to comment on how it is implementing the law, while Google’s YouTube said it would continue to invest heavily in staff and technology to comply with NetzDG.
Among other countries considering similar measures, France is looking at rules to block “fake news”, Britain is seeking to stop online harassment of politicians and Japan is looking to restrict suicide-related posts after a suspected serial killer found his victims by trawling Twitter.
Lawmakers from Chancellor Angela Merkel’s conservative Christian Democrats and the left-leaning Social Democrats – parties poised to renew a governing coalition – are formulating changes to NetzDG.
“We will add a provision so that users have a legal possibility to have unjustly deleted content restored,” said Johannes Ferchner, spokesman on justice and consumer protection for the Social Democrats and one of the architects of the law.
He did not elaborate on how such a mechanism might work. At present, there is no formal mechanism for people to get content restored, beyond complaining to the social media firm involved.
Thomas Jarzombek, a Christian Democrat who helped refine the law, said the separate body to review complaints should be established, adding that social media companies were deleting too much online content.
NetzDG already allows for such a self-regulatory body, but companies have chosen to go their own way instead. According to the coalition agreement, both parties want to “develop” the law to encourage the establishment of such a body.
Germany’s opposition Free Democrats (FDP), who believe courts rather than companies should decide whether content is removed, said the proposed changes would not make a big difference.
German authorities have stressed, however, that they believe the law is working well in terms of forcing social media companies to delete offensive posts.
The Federal Office of Justice, which is responsible for administering the new law, said it had received 205 online complaints about sites failing to delete offensive posts in January and February, far fewer than expected. Officials had predicted 25,000 complaints a year.