ChatGPT will soon start blocking chats about suicide, flirting

Technology

OpenAI is bringing age verification to ChatGPT, following a heartbreaking case in April in which a teen died by suicide after chatting with the bot.
The new system will use AI to estimate whether users are between 13 and 18, with the goal of making the chatbot safer for younger people.

Stricter content limits for teens

CEO Sam Altman says these updates are meant to protect teens online.
Teens will face stricter content limits, including no chats about suicide or flirting. If an underage user expresses suicidal thoughts, ChatGPT could alert their parents or the authorities.
Adults will also need to verify their age, and while that means giving up some privacy, OpenAI believes safety comes first.