TikTok pushed explicit content to child accounts despite safety settings
TikTok's algorithm is recommending pornographic and highly sexualized content: Report

Oct 03, 2025, 10:35 am

What's the story

A recent report from Global Witness, a human rights campaign group, has revealed that TikTok's algorithm recommends pornographic and highly sexualized content to child accounts. The researchers created fake child accounts with safety settings activated but still received sexually explicit search suggestions. These terms led to sexualized material, including explicit videos of penetrative sex.

Investigation details

Researchers created four accounts on TikTok

In late July and early August, researchers from Global Witness created four accounts on TikTok posing as 13-year-olds. They provided fake dates of birth and were not asked to provide any other information. Even with the platform's "restricted mode" activated, which is supposed to block mature or complex themes like sexually suggestive content, the accounts were still recommended overtly sexualized search terms.

Content discovery

'You may like' section recommended explicit content

The researchers found explicit content in the "you may like" section of TikTok. This included videos of women simulating masturbation, flashing their underwear in public, or exposing their breasts. The most extreme cases even featured explicit pornographic films of penetrative sex embedded into other innocent-looking content to bypass content moderation mechanisms.

Shock and commitment

Findings 'huge shock' to researchers

Ava Lee from Global Witness said the findings were a "huge shock" to researchers. She added, "TikTok isn't just failing to prevent children from accessing inappropriate content, it's suggesting it to them as soon as they create an account." In response, TikTok reiterated its commitment to safe and age-appropriate experiences for users. The platform claims it has over 50 features aimed at keeping teens safe online.

Immediate response

TikTok took action after being informed of violations

After being informed by Global Witness about its findings, TikTok said it took action to "remove content that violated our policies and launch improvements to our search suggestion feature." The platform also noted that it removes nine out of 10 videos violating its guidelines before they are viewed. Despite these measures, the researchers found sexual content being recommended again in a follow-up investigation conducted after the Children's Codes came into force on July 25 this year.

Regulatory response

Children's Codes mandate platforms to block pornographic content

The Children's Codes, introduced under the UK's Online Safety Act, mandate platforms to use "highly effective age assurance" to prevent children from seeing pornographic content. They also require platforms to adjust their algorithms to block content promoting self-harm, suicide, or eating disorders. Ava Lee from Global Witness said, "Everyone agrees that we should keep children safe online... Now it's time for regulators to step in."