Proposed EU Digital Services Act (DSA) – “notice and (NO) action”: Lessons (not) learned from testing the content moderation systems of very large social media platforms (CEP Policy Brief)

Alexander Ritzmann

CEP Policy Brief
June 2021


Key findings of independent monitoring reports
1) “Notice and action” systems do not seem to work properly
Across six independent monitoring reports, very large platforms (“gatekeepers”) removed on average only 42% of illegal content reported through user notices. This finding could be considered a “disproof of concept” for voluntary content moderation: if even reported illegal content is mostly left online, the implications for legal but harmful content are clearly negative.


2) “Trusted flaggers” might be too close to the platforms they monitor
Trusted flaggers, who are supposed to play a key role in the notice and action framework, are often underfunded and to some degree dependent on the platforms they monitor. There are also indications that in some cases the platforms knew about upcoming monitoring activities, which might have influenced the overall monitoring results.

Recommendations for the draft DSA
1) (Article 19) – Ensure the financial independence of trusted flaggers by creating an EU-wide monitoring fund, financed by contributions from the companies falling under the DSA in proportion to their average monthly users in the EU. This EU DSA monitoring fund should be administered by the EU Commission or the European Board for Digital Services, neither by the companies being monitored nor by EU Member States.


2) (Article 7) – Ensure the protection of EU citizens online from illegal extremist/terrorist content by mandating that gatekeepers use proactive measures, subject to strict rules on the transparency, auditability, and effectiveness of the automated decision-making systems applied. This approach will protect users’ civil liberties far better than trusting the companies’ voluntary efforts. According to the gatekeepers themselves, the sheer volume of content already forces them to apply proactive technical measures extensively, such as upload and re-upload filters, to tackle illegal (or unwanted) content.
