UK to allow testing AI that creates child sexual abuse material
The UK is introducing a new law that will let tech companies and child safety groups test AI tools that could be used to create child sexual abuse material (CSAM).
The move follows a sharp rise in reports of AI-generated CSAM, from 199 cases last year to 426 this year.
The goal is to catch problems early by having experts check these AI models and ensure strong safeguards are built in.
New law part of crime and policing bill
The law is part of updates to the Crime and Policing Bill, which also bans possessing, creating or distributing any AI models designed to generate CSAM.
Kanishka Narayan, the minister for AI and online safety, said the aim is to spot risks before they cause harm.
Groups such as Childline have noticed more children mentioning AI in counselling sessions, underlining why these protections are needed now more than ever.