Shocking! YouTube found recommending soft-core porn videos featuring minors
Once again, YouTube is facing backlash over its content moderation practices. According to a Reddit post from creator Matt Watson, the platform has been found serving soft-core videos of scantily clad minors through its recommendation section. In doing so, it has given pedophiles a venue for exchanging inappropriate comments, enabling the sexualization and exploitation of those kids. Here's more on the case.
What exactly is the issue?
Watson discovered that a simple search for 'bikini haul' content led him to videos of underage girls. YouTube's algorithms use previous video searches to recommend content, but in this case, Watson masked his browsing history using a VPN and searched for adult content. He found that within 10 minutes, sometimes in fewer than five clicks, YouTube's 'Up next' sidebar recommended videos of minor girls.
Wormhole into a soft-core pedophilia ring
With content like this, YouTube is serving as a hub that facilitates the exploitation of young kids, Watson emphasized. He found that many recommended clips had comments enabled, allowing pedophiles to share sexually explicit remarks with timestamps pointing to inappropriate parts of the clips. Further, he said, the videos facilitate their "ability to connect with each other, trade contact info, and link to child pornography."
What type of content did YouTube promote?
The algorithms promoted a video named "sweet sixteen pool party" as well as similar content showing scantily clad minor girls performing gymnastic poses and other activities. Note that the content itself isn't porn; these are innocent clips being repurposed by pedophiles as erotic material.
This is in stark contrast to YouTube's policy
Back in 2017, YouTube introduced a policy promising to disable comments on all videos featuring minors that attracted inappropriate remarks. In this case, some of the videos had comments disabled, but not all of them. Moreover, all of them were being promoted by the platform through random, unrelated searches, which is a problem in itself.
Also, the content is being monetized
What's even more disturbing is that YouTube is monetizing the content in question. Watson discovered, and demonstrated in a video, that the clips recommended by YouTube's algorithms carry adverts from brands like Disney, Lysol, and McDonald's. Notably, two years ago, YouTube faced backlash in a similar case where exploitative videos were being monetized.
Now, YouTube is reviewing the content and its policies
In the wake of Watson's report, YouTube issued a statement saying "any content - including comments - that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube." A company spokesperson told TechCrunch that YouTube is also reviewing its policies in light of the content and comments flagged by the creator, and said some of the content has already been taken down.