NetzDG 2.0 – Recommendations for the amendment of the Network Enforcement Act (NetzDG) and Investigation into the actual blocking and removal processes of YouTube, Facebook and Instagram

Policy Paper, Counter Extremism Project (CEP) Berlin, March 2020

Authors: Alexander Ritzmann (Senior Advisor), Dr. Hans-Jakob Schindler (Senior Director) and Marco Macori (Research Fellow)

The paper can be downloaded here.

The Counter Extremism Project (CEP) Berlin conducted a new study between January 31st and February 14th to test big tech’s compliance with Germany’s 2018 NetzDG online content moderation law. The law in its current form requires online platforms to remove “manifestly illegal” content within 24 hours, but only after it has been reported by users. CEP’s study revealed that YouTube, Facebook, and Instagram removed a mere 43.5 percent of clearly extremist and terrorist content, even after that material had been reported for its illegal nature under the NetzDG.

As first reported by Die Welt, the study found that of the 92 “manifestly illegal” instances of online content flagged by CEP, only 43.5 percent were blocked or deleted by the platforms. Of the companies studied, YouTube was the least compliant with the law’s requirements: it blocked only 35 percent of the 80 videos that were reported and should have been blocked. Facebook and Instagram deleted or blocked all of the flagged content, but Facebook did not remove any content that was not explicitly flagged, even though that content contained the same illegal symbols as the reported material. CEP Berlin’s findings suggest that this “notice and takedown” method for removing illegal content can only be effective if platforms are searched continuously and systematically for such material.

German lawmakers are currently discussing several amendments to the NetzDG. The study underlines the recommendations which CEP Berlin has published in a recent NetzDG policy paper. In particular, it is clear that passive and reactive approaches to removing illegal content are insufficient. Further, the low blocking rate for reported content shows that greater transparency and auditability from tech companies is needed to explain and improve the inadequate compliance rate.

