Facebook, Twitter and other social media companies have been given an ultimatum by the European Union: rid your platforms of hate speech or face legal consequences.
For years, European regulators have pushed social media firms to remove racist and violent posts from their platforms promptly. Their patience is running out.
Facebook, Twitter, Microsoft and Google have all pledged to do more. In May 2016, they promised to review a majority of hate speech flagged by users within 24 hours and to remove any illegal content.
But the European Commission, the EU’s top regulator, said Thursday that the companies are still failing to act fast enough. It warned it could pass laws allowing the EU to punish companies that don’t comply.
“The situation is not sustainable: in more than 28% of cases, it takes more than one week for online platforms to take down illegal content,” said Mariya Gabriel, the EU’s top official in charge of the digital economy and society.
The Commission said it will consider implementing new laws to tackle the problem if the online platforms fail to “take swift action over the coming months.”
It said it wants the companies to invest more in detecting hate speech, and to work with trusted reviewers who are trained to recognize what constitutes hate speech.
It also wants companies to do a better job of preventing illegal content from reappearing.
The punishments could be severe. The EU has a reputation for coming down hard on companies that don’t play by its rules.
Earlier this year, it ordered Google to pay a $2.7 billion antitrust fine. On Wednesday, it announced a $1 billion penalty for truck manufacturer Scania for participating in a cartel.
Several European countries aren’t waiting for the EU to act. They’re already pushing through strict laws punishing social media companies for being too lax when it comes to illegal hate speech.
The German government approved a plan in April to start imposing fines of as much as €50 million ($59 million) on Facebook, Twitter and others if they fail to remove hate speech and fake news posts within 24 hours of being flagged. Other illegal content must be deleted within seven days of being reported.
In the U.K., a parliamentary committee has accused social media firms of prioritizing profit over user safety by continuing to host unlawful content. The committee called for “meaningful fines” if the companies fail to improve quickly.