A critical European Union law that allowed major technology companies to proactively scan their platforms for child sexual abuse material (CSAM) has expired, creating a regulatory void that safety experts warn will leave children vulnerable and crimes undetected.
The temporary measure, an exception carved out of the bloc's broader privacy rules, permitted automated detection tools to identify harmful content such as CSAM, grooming, and sextortion within private messages. It lapsed on April 3 after the European Parliament declined to extend it amid ongoing debate over its privacy implications.
This lapse leaves technology firms in a legal bind. Now that the carve-out has expired, proactively scanning user communications is no longer permitted, yet companies remain obligated under the separate Digital Services Act to remove any illegal content hosted on their services. In a joint statement, Google, Meta, Snap, and Microsoft announced they would continue voluntary scanning for CSAM, calling the legislative inaction “an irresponsible failure” that undermines established child protection work.
Child safety organizations have sounded the alarm, predicting a sharp decline in abuse reports. They point to a precedent from 2021, when a similar legal gap led to a 58% drop in reports from EU-based accounts to the U.S. National Center for Missing & Exploited Children (NCMEC) over an 18-week period.
“When detection tools are disrupted, we lose the visibility that is critical to finding and protecting victims,” said a representative from NCMEC, which received over 21 million reports globally last year. “The abuse does not stop when detection goes dark.”
The implications are international. Perpetrators of cross-border crimes, including sextortion schemes where individuals are blackmailed with intimate images, may exploit this new uncertainty in European digital safeguards.
The core conflict delaying a permanent EU law pits the imperative to protect children against concerns over privacy and mass surveillance. Privacy advocates have criticized scanning mandates as a form of “chat control” that threatens fundamental rights. Safety experts counter that the specific technology in question does not equate to indiscriminate surveillance.
“The technology does not store data or review private communications,” explained a policy director from Thorn, a non-profit that develops detection tools. “It uses digital fingerprints of known abuse material to block matching uploads automatically. Identifying child sexual abuse is not an infringement on free speech.”
Negotiations for a permanent legislative framework have been ongoing for four years; the Parliament has called the file a priority but has offered no timeline for a resolution. Critics argue the current impasse sends a dangerous signal.
“By allowing this gap, the EU is effectively risking open doors for predators,” said a policy head from a child safety foundation. “If there is a serious commitment to protecting children online, agreeing on a permanent, balanced law cannot wait.”
