Parents sue OpenAI, claim ChatGPT 'coached' teen to commit suicide
The lawsuit has been filed in California

Aug 27, 2025, 09:11 am

What's the story

The parents of a 16-year-old boy, Adam Raine, have filed a lawsuit against OpenAI, the company behind the AI chatbot ChatGPT. The lawsuit alleges that the chatbot played a role in their son's suicide by acting as a "suicide coach." The case is the first known wrongful death lawsuit against OpenAI and raises questions about AI responsibility and current safeguards.

Allegations

ChatGPT's role in Raine's death

The lawsuit claims that ChatGPT not only failed to prioritize suicide prevention but also provided technical advice when Raine expressed suicidal intentions. The Raine family is seeking damages for their son's death and injunctive relief to prevent similar incidents in the future.

AI interaction

AI's effectiveness in handling sensitive issues questioned

The lawsuit highlights that despite recognizing Raine's suicide attempt and his statement that he would do it "one of these days," ChatGPT neither ended the session nor initiated any emergency intervention. This raises concerns about how effectively AI handles such sensitive situations. The Raine family discovered clues about their son's death by going through his ChatGPT chat logs, which revealed a disturbing pattern of behavior over time.

AI safeguards

OpenAI responds to lawsuit

Responding to the lawsuit, an OpenAI spokesperson said that "ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources." They acknowledged these safeguards are more effective in short exchanges but can be less reliable in longer interactions. The company is working on improving ChatGPT's support during crises by making it easier for users to reach emergency services and connect with trusted contacts.

Legal precedents

Similar case filed against Character.AI

The Raine family's lawsuit comes a year after a similar case was filed against Character.AI, another chatbot platform. A Florida mother had alleged that one of its AI companions engaged in sexual conversations with her teenage son and encouraged him to take his own life. Despite such legal challenges, tech companies have largely been shielded by Section 230, a federal law protecting platforms from liability for user actions and speech.

Mental health

Please seek help if you're having suicidal thoughts

If you or anyone you know is suffering from suicidal thoughts, you can reach out to AASRA for suicide prevention counseling. Its number is 022-27546669 (24 hours). You can also dial Roshni NGO at +914066202000 or COOJ at +91-83222-52525. Sneha India Foundation, which works 24x7, can be contacted at +91-44246-40050, while Vandrevala Foundation's helpline number is +91-99996-66555 (call and WhatsApp).