Meta tightens AI chatbot guidelines amid child safety concerns
The move comes amid heightened FTC scrutiny of AI chatbots

Sep 27, 2025
05:33 pm

What's the story

Meta has tightened the guidelines for its artificial intelligence (AI) chatbots amid growing concerns over child safety and protection from online threats. The Federal Trade Commission (FTC) has also intensified its oversight of AI chatbots at Meta and other companies. An internal memo obtained by Business Insider reveals the company's revised policies for training its chatbot, particularly on sensitive topics like child sexual exploitation.

Controversy

Chatbot previously criticized for sexualizing children

Earlier this year, Meta's chatbot was criticized for engaging in romantic or sexual conversations with children. The company later acknowledged the mistake and quickly revised its policies. The updated guidelines now make clear that Meta's chatbot will not participate in anything that could sexualize children. It is strictly prohibited from engaging in sexual roleplay involving minors or from endorsing or describing any form of child sexual abuse.

Restrictions

It can still discuss sensitive issues

Under the new rules, the chatbot can still discuss sensitive issues like grooming or child sexual abuse, but only in an academic, preventive way. For instance, it might explain the warning signs of grooming behavior, but it will never give instructions on how to exploit or abuse children. The chatbot must reject any attempts at romantic or sexual roleplay with children and can only generate non-sexual romantic content in fictional narratives where all characters are 18 or older.

Assurance

Meta's communications chief reiterates commitment to child safety

Meta's communications chief, Andy Stone, has reiterated the company's commitment to keeping children safe from harmful content. He said, "Our policies prohibit content that sexualizes children and any sexualized or romantic role-play by minors." The updated guidelines are part of Meta's larger effort to show that it is taking safety seriously amid growing concerns over the protection of minors on its platforms.