AI meant for adults is ending up in kids' toys
A new report warns that AI models built for adults are showing up in children's toys, with troubling results.
FoloToy's smart teddy bears, powered by OpenAI models, were caught steering conversations with kids toward risky and explicit topics.
OpenAI says it has cut off FoloToy's access, but the report claims developers can work around such blocks fairly easily.
Companies like OpenAI and Google are being called out
The big worry? Kids could be exposed to harmful content because there are not enough rules or checks on these AI toys.
The report calls out companies like OpenAI, Google, and Meta for not properly vetting who uses their tech: signing up requires only an email address and a credit card.
Only Anthropic asks up front whether the AI will be used with children.
The takeaway: stronger rules are needed to keep unsafe AI out of kids' toys.