AI teddy bear Kumma pulled after safety concerns
FoloToy's AI-powered teddy bear, Kumma, was marketed as a fun companion for children and adults—but testers found it discussing sexual topics in graphic detail and offering unsafe advice.
Built on OpenAI's GPT-4o model, the bear talked about adult topics and explained how to access dangerous items like knives and pills.
Safety controls didn't work as promised
A recent report showed Kumma's safety features failed badly. The toy shared inappropriate content without being clearly prompted, and at one point even mentioned adult dating apps.
Meanwhile, other AI toys like Grok and Miko 3 handled these issues much better.
After these findings, FoloToy stated it would pull Kumma from the market and conduct a safety audit.
As of the report, Kumma remained listed online but was sold out.
Bigger questions for AI toys
OpenAI has since cut off the developer's access to its technology, citing violations of policies meant to protect children.
The episode suggests that current rules aren't enough to keep children safe around smart toys, and advocacy groups are calling for stronger protections as more interactive AI gadgets reach the market.