Roblox's new AI tool scans chats for predatory language

Roblox just rolled out Sentinel, an open-source AI tool that looks for predatory language and grooming in game chats.
This comes after the company faced lawsuits over user safety.
Sentinel has already helped Roblox submit roughly 1,200 reports of potential child exploitation to the National Center for Missing and Exploited Children in the first half of 2025.

How Sentinel works

Sentinel scans one-minute slices from billions of daily messages on Roblox.
Using machine learning, it scores each snippet as harmless or potentially harmful and tracks a user's chat patterns over time, since grooming tends to unfold gradually rather than in a single message.
If something seems off, human moderators step in and can alert law enforcement if needed.
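To make the pipeline concrete, here is a minimal sketch in Python of the same idea: slice chat into one-minute windows, score each window, and surface only the risky ones for human review. The `score_risk` keyword heuristic, the 0.8 threshold, and all function names are illustrative assumptions, not details of Roblox's actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class ChatMessage:
    user_id: str
    text: str
    sent_at: datetime


def score_risk(text: str) -> float:
    """Stand-in for a trained classifier: a crude keyword check returning a
    risk score in [0, 1]. Roblox's real model is not public in this form."""
    red_flags = ("keep this secret", "don't tell your parents", "how old are you")
    hits = sum(phrase in text.lower() for phrase in red_flags)
    return min(1.0, hits / 2)


def one_minute_windows(messages: list[ChatMessage]):
    """Group a user's messages into one-minute slices, mirroring the
    one-minute snapshots described above."""
    if not messages:
        return
    messages = sorted(messages, key=lambda m: m.sent_at)
    window_start = messages[0].sent_at
    window: list[ChatMessage] = []
    for msg in messages:
        if msg.sent_at - window_start >= timedelta(minutes=1):
            yield window
            window, window_start = [], msg.sent_at
        window.append(msg)
    if window:
        yield window


def flag_for_review(messages: list[ChatMessage], threshold: float = 0.8) -> list[list[ChatMessage]]:
    """Return windows whose average risk score crosses the (assumed) threshold,
    so a human moderator can review them before anything is reported."""
    flagged = []
    for window in one_minute_windows(messages):
        avg = sum(score_risk(m.text) for m in window) / len(window)
        if avg >= threshold:
            flagged.append(window)
    return flagged
```

Calling `flag_for_review(history)` on a user's recent messages would return only the windows worth a closer look, keeping human moderators focused on the small fraction of conversations that actually need review.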

What's next for Roblox?

By open-sourcing Sentinel, Roblox hopes other platforms will adopt similar tools to protect young users.
Because Roblox chats are not end-to-end encrypted, the system can also monitor private messages, letting moderators step in quickly when trouble surfaces.
It's a big step toward making online spaces safer for millions of kids and teens who love to play and connect on Roblox.