On Monday, EU countries and lawmakers failed to reach an agreement on extending the interim derogation that allows online platforms to voluntarily detect and remove child sexual abuse material.
Back when the EU's e-Privacy rules were negotiated, policy-makers already recognised the value of proactive detection in preventing the dissemination of child sexual abuse material online. This is why a temporary exception, the Interim Regulation, was created. This exception has allowed online platforms to detect, report and remove millions of images and videos depicting the sexual abuse of children, helping victims heal and preventing further abuse.
With the derogation set to expire on 3 April, the European Commission proposed an extension so that detection can continue while EU policy-makers negotiate a permanent framework for online platforms to effectively prevent the dissemination of CSAM on their services. However, the lead negotiators, MEP Birgit Sippel for the European Parliament and the Cypriot Presidency for the Council of the EU, failed to reach an agreement, leaving children less protected than they are today.
This gap in protection was entirely avoidable and policymakers were fully aware of the consequences.
ECLAG Coalition
Consequences of Political Inaction
In 2021, when a similar legal gap occurred, reports of child sexual abuse material dropped by 58%, not because abuse decreased, but because detection efforts were significantly limited. The main disagreement between the negotiators was the scope of detection: the European Parliament wanted to limit it to known CSAM and to target detection at suspected users, while the Council argued for keeping the original scope.
The sheer scale of child sexual abuse material circulating online demands a bold response: in recent years, 99% of CSAM reports were submitted by online platforms using detection technology. The burden of prevention should not fall on children, who are expected to report abuse, but on the online platforms that enable it. We support proactive detection of all forms of CSAM as a critical safeguard for children, and we are deeply concerned to see policymakers fall short at such a critical moment.
We echo the ECLAG coalition's words: “this gap in protection was entirely avoidable and policymakers were fully aware of the consequences.” Urgent action is needed to ensure platforms can continue to detect and report abuse while respecting fundamental rights, and to deliver a sustainable long-term framework that effectively protects children online.
Read ECLAG’s reaction here.
Senior EU Policy Officer for Tech Policy, Child Helpline International