Character.AI to ban users under 18 amid legal scrutiny
Character.AI to ban minors starting late November

Oct 30, 2025, 12:51 pm

What's the story

Character.AI, a leading chatbot company, has announced plans to ban users under 18 from interacting with its virtual companions, beginning in late November. The decision comes amid legal scrutiny and concerns over the impact of AI companions on mental health. The company lets users create characters for open-ended conversations, but it is now facing lawsuits over a teenager's suicide allegedly linked to its technology.

Policy update

Regulatory scrutiny and concerns prompt user age change

In its announcement, Character.AI said it is making these changes "in light of the evolving landscape around AI and teens." The company has faced questions from regulators about the content teens may encounter while chatting with AI. It also expressed concern about how open-ended AI chat could affect teens, even when content controls work perfectly.

Legal challenges

Lawsuit over emotional attachment leading to suicide

The policy change follows a lawsuit filed by the family of 14-year-old Sewell Setzer III, who took his own life after developing an emotional attachment to a character he had created on Character.AI. His family blamed the company for his death and called its technology "dangerous and untested." Since then, more families have sued Character.AI with similar allegations.

Future plans

Age assurance functionality to be introduced

Along with the ban, Character.AI plans to introduce an "age assurance functionality" by November 25. The feature is intended to ensure that users receive an experience appropriate to their age. The company said it doesn't take this step lightly but believes it is necessary given the concerns raised about how teens interact with this new technology.

Industry concerns

Similar lawsuits against other chatbot companies

Character.AI isn't the only company under fire over its chatbot's impact on mental health, especially among younger users. Earlier this year, the family of 16-year-old Adam Raine sued OpenAI, alleging the company prioritized user engagement over safety. In response to these concerns, in October 2025 California became the first state to pass an AI law setting safety guidelines for minors. The law is set to take effect next year.

Legislative action

Proposed legislation in Congress addressing the issue

At the federal level, Senators Josh Hawley and Richard Blumenthal have proposed a bill to bar minors from using AI companions like those on Character.AI. The legislation would also require companies to implement an age-verification process. "Chatbots develop relationships with kids using fake empathy and are encouraging suicide. We in Congress have a moral duty to enact bright-line rules to prevent further harm from this new technology," Hawley said.