
Big Tech criticises EU after child abuse detection law expires

Apr 10, 2026
03:38 pm

What's the story

Google, Meta, Snap, and Microsoft have condemned the European Parliament's decision not to extend a law that allowed big tech firms to search for child sexual exploitation on their platforms. The regulation, a temporary derogation from the EU's ePrivacy rules in force since 2021, allowed companies to use automated detection technologies to scan messages for harms such as child sexual abuse material (CSAM), grooming, and sextortion.

Industry response

Tech companies express disappointment over European Parliament's decision

The law expired on April 3 after the European Parliament, citing privacy concerns raised by some lawmakers, chose not to vote on its extension. This has left a regulatory gap and created uncertainty for big tech companies. In a joint statement published on a Google blog, Google, Meta, Snap, and Microsoft expressed their disappointment with the decision. "We are disappointed by this irresponsible failure to reach an agreement to maintain established efforts to protect children online," they said.

Legislative focus

Regulatory gap raises concerns about child safety

The European Parliament has said it is prioritizing work on permanent legislation to prevent and combat online child sexual abuse, but there is no timeline for agreement or implementation. Child protection advocates have warned that the lapse could cause a sharp decline in reports of child sexual abuse, pointing to a similar legal gap in 2021, when reports from EU-based accounts fell by 58% over 18 weeks.


Detection concerns

Detection tools crucial for protecting child sexual abuse victims

John Shehan, VP at the National Center for Missing and Exploited Children (NCMEC), warned that disrupting detection tools could directly impair the organization's ability to find and protect victims of child sexual abuse. "When detection goes dark, the abuse doesn't stop," he said. In 2025 alone, NCMEC received 21.3 million reports from around the world, containing over 61.8 million images, videos, and other files suspected of being related to child abuse.


Global impact

Global impact of the EU's lapsed message-scanning rules

Child safety experts warn that the lapse of the EU's scanning rules will have global repercussions, since many internet crimes cross borders, with offenders sending illegal images to, or grooming, children in other countries. "The offender can be anywhere in the world, but they could have unfettered access to minors in Europe now that there's legal uncertainty around those safeguards and protections to identify when a child is being groomed," Shehan said.

Regulatory debate

Proposed legislation has been under negotiation for 4 years

The proposed child sexual abuse regulation has been under negotiation for four years. It would require companies to take steps to mitigate risks on their platforms, which has been a point of contention: privacy advocates argue that scanning messages for child abuse threatens EU citizens' fundamental privacy rights and data security, likening the measures to "chat control" that could enable mass surveillance and produce false positives.
