Instagram will alert parents if teens search self-harm topics
Starting next week, Instagram will alert parents whose teens are enrolled in its supervision tools if those teens repeatedly search for self-harm or suicide topics.
This new safety feature is rolling out first in the UK, US, Australia, and Canada.
Alerts will arrive via email, SMS, WhatsApp, or directly in the app whenever a teen's search activity shows a repeated pattern around these topics.
Critics say this could cause unnecessary panic among parents
Some groups aren't convinced this is the right move.
The Molly Rose Foundation, set up after 14-year-old Molly Russell died by suicide following exposure to harmful content, called the alerts risky, warning that vague notifications could cause parents to panic. Its CEO, Andy Burrows, also cited the foundation's earlier research, which found that Instagram still actively recommends harmful content.
Meta plans to expand this feature to more countries soon
Meta rejects the criticism, saying the foundation's report is misleading. The company is also planning new warnings for chats with AI bots on its platforms.
After the initial launch in the UK, US, Australia, and Canada, Meta plans to bring these features to additional countries.