Minors have now sued an AI company over CSAM
Elon Musk's AI company xAI is facing a class-action lawsuit after its Grok AI model was allegedly used to create child sexual abuse material (CSAM).
Filed in California, the suit says Grok let users generate explicit images of minors by altering photos taken from social media and messages.
Some of these fake images were then shared on platforms like Discord and Telegram.
One teen discovered that their high school photos had been turned into nude images; another learned of the altered images only from investigators, after the perpetrator was arrested in December 2025.
Lawsuit aims to represent thousands of minors
Three anonymous plaintiffs are leading the case, but it aims to represent thousands of minors whose images were altered using Grok.
This is reportedly the first time minors have sued an AI company over CSAM.
The lawsuit claims xAI ignored industry-standard safety measures and designed Grok in a way that allowed users to create millions of sexualized images, including 23,000 involving children, in just 11 days.
State attorneys general have also demanded xAI remove all nonconsensual content and add stronger safeguards to prevent future harm.