Can AI be held responsible for a user's death?

A 16-year-old named Adam Raine died by suicide earlier this year, and now his parents are suing OpenAI.
They say ChatGPT-4o, which Adam chatted with for months, failed to help him or to stop him, despite safety features meant to protect users.
The lawsuit raises questions about how well AI chatbots can really look out for vulnerable teens.

Adam's parents believe ChatGPT's responses contributed to his death

Adam managed to get around ChatGPT's safety warnings by saying his questions were for a fictional story.
His parents believe this let harmful conversations slip through and contributed to his death.
OpenAI admits its safeguards work better in short chats than long ones and says it's still working on improvements.
This case is the first of its kind against OpenAI, coming as more lawsuits target other AI platforms over similar concerns about user safety.