TikTok shows explicit content to '13-year-old' users: Investigation
A new investigation by Global Witness found that TikTok's algorithm is recommending sexual and explicit videos to accounts set up as 13-year-olds—even when "restricted mode" is turned on.
The fake teen profiles never searched for anything, yet they were still served inappropriate content through recommendations alone.
Researcher Ava Lee summed it up: "TikTok isn't just failing to prevent children from accessing inappropriate content—it's suggesting it to them as soon as they create an account."
Research team urges regulators to step in
The report points to a bigger issue with TikTok's content moderation, especially since the Online Safety Act's Children's Codes came into force on July 25, 2025.
Harmful material was found hidden inside otherwise normal posts and kept slipping past content moderation.
Lee's team has been investigating TikTok's child safety features throughout the year and keeps finding the same problems; it is now urging regulators to step in.
TikTok claims to be committed to age-appropriate experiences
TikTok says it is committed to providing age-appropriate experiences and has taken steps to remove violating content and improve its recommendation systems.
But with ongoing criticism about child safety, many are watching closely for real changes.