Character AI bans under-18s from chatting with bots
Character AI is shutting down open-ended chatbot conversations for anyone under 18 by November 25, 2025. The move follows lawsuits and public concern about teen safety, including reports linking the platform to tragic incidents. CEO Karandeep Anand said teens will first face a two-hour daily limit before losing access entirely.
Age checks and new features
To enforce the ban, Character AI will roll out age checks using tools such as facial recognition and ID verification. The company is also shifting its focus: teens can still use creative features like AvatarFX for video or Scenes for storytelling, but open-ended AI companionship will no longer be available to them. Anand says the goal is to offer safer, content-driven activities, supported by the company's new AI Safety Lab.
Aligning with legal trends
The change aligns with a broader push for tighter rules on tech companies in the US. Lawmakers are advancing bills to keep minors away from risky AI features, especially after concerns about chatbots encouraging self-harm. Character AI's update shows how platforms are being pressed to put young users' safety first.