FTC probes AI chatbot interactions with minors

Technology

The US Federal Trade Commission (FTC) is digging into how companies like Google, OpenAI, and Meta handle interactions between their AI chatbots and young users.
The big question: are these companies doing enough to keep kids safe and prevent harmful behavior when minors use their bots?

ChatGPT linked to a teen's suicide plan

The FTC is using its authority to gather information on company practices, especially the collection of personal data from kids under 13, which is illegal without parental consent.
There's growing pressure after a lawsuit linked ChatGPT to a teen's suicide plan, putting the spotlight on whether current protections go far enough for teenagers too.
Depending on what the FTC finds, the inquiry could inform future investigations or regulatory action against tech companies.