EU adopts controversial law forcing one-hour takedowns of terrorist content

The European Parliament has formally adopted a law requiring internet companies to “remove or disable access to flagged terrorist content” within one hour after being notified by national authorities. Once issued, such takedown notices will apply across the EU, with countries able to levy financial penalties against firms that refuse to comply.

The legislation will come into force 12 months after it is published in the EU’s official journal, a standard step for all EU law. It will then have to be adopted by each member state.

The legislation has been under discussion in the EU for years. It was first proposed in 2018 at a time when terrorist attacks by groups like ISIS were peaking across the continent and lawmakers were worried about online radicalization. Since then the proposals have been making their way through the EU’s legislative bodies, with several major amendments.

Notably, the legislation now explicitly excludes takedowns targeting terrorist content that’s part of any educational, artistic, journalistic, or academic material. It also includes no obligation for internet companies to preemptively monitor or filter their content.

However, many MEPs and rights groups are still worried the legislation will have unintended effects. Although companies are under no obligation to filter content, critics say, they might choose to do so anyway in order to avoid having to respond quickly to takedown notices. Experts worry that such algorithmic filters could be over-zealous, blocking legitimate content.

“To say that [using automated filters] is not an obligation is to allow it,” Gwendoline Delbos-Corfield, a Green MEP from France, told EURACTIV. Such technology might look particularly attractive in a “context where the algorithm is going to be cheaper than human means.”

Another concern is that the one-hour takedown limit is too short a time frame for smaller platforms with fewer resources. Terrorist groups often host content on such platforms precisely because they lack the means to moderate it. Critics argue the rule would make business harder for these companies, stifling competition in a market already dominated by US tech giants.

It’s also possible that EU member states could interpret what constitutes “terrorist” content in harmful ways. “This could open the way for authoritarian regimes, like those in Poland and Hungary, to silence their critics abroad by issuing removal orders beyond their borders,” writes one civil rights group. “Because this must happen within the hour, online platforms will have no option but to comply with these orders to avoid fines or legal problems.”

