27 Dec 2019
To keep children safe, YouTube once considered screening videos manually
Despite being incredibly successful, YouTube hasn't been able to dodge controversies.
The platform has been marred by several issues, especially those revolving around poor content moderation and exposing kids to inappropriate videos.
Here's all you need to know about it.
First, a bit on YouTube's main problem
On multiple occasions this year and last, YouTube videos purporting to be for kids, such as game clips or videos featuring cartoon characters like Mickey Mouse, were caught containing inappropriate content such as school shootings, human trafficking, or suicide/self-harm.
The revelation triggered a wave of criticism from parents and privacy advocates, leading to questions over YouTube's efforts to build a safer community for underage users.
Major steps taken to boost kids' privacy
In the wake of an FTC fine (detailed below), YouTube implemented a number of changes to boost privacy and protect kids using both its main app and YouTube Kids - the kids-dedicated version of the platform.
Notably, among these changes, the company made it mandatory for creators to declare whether their video is "made for kids" (a designation that directly affects their ad revenue).
But, before that, YouTube considered curating videos manually
YouTube's move puts the onus of responsible sharing on creators. But prior to this, the company had considered taking the matter into its own hands, Bloomberg reported.
Essentially, YouTube had planned a project, dubbed Crosswalk, to manually screen every single video directed to the feed of YouTube Kids.
The move, the company hoped, would prevent troublesome content from reaching kids under 8.
Plan came very close to execution
As per Bloomberg, which spoke to multiple people familiar with project Crosswalk (a way to guide kids across YouTube's chaotic streets), YouTube came extremely close to beginning manual screening of kids' videos.
It had assembled a team of 40 people and even drafted a press release for the announcement, but then the company's CEO, Susan Wojcicki, and her aides decided to drop the plan.
Why was the plan ditched?
YouTube decided to ditch the screening plan in order to maintain neutrality, one of Bloomberg's sources claimed.
Essentially, the problem was that if the platform handpicked videos, even videos for kids, it would look more like a media company than a neutral content-hosting platform.
However, a YouTube spokesperson denied that the company ditched the program for this reason.
Still, YouTube's effort has shown some results
While YouTube may not be willing to manually screen kids' videos to rule out troublesome content, its other steps, including disabling comments on certain clips, have proven fruitful to some extent.
The company claims to have reduced the views of clips violating its policies by as much as 80% and increased the viewership of clips from "authoritative news publishers" by a significant 60%.
Then, YouTube was fined for violating children's privacy
After the inappropriate content issues, YouTube was fined $170 million by the FTC for violating the US Children's Online Privacy Protection Act. The company was accused of collecting personal information from the viewers of channels targeted at kids under 13 without explicit disclosure/parental consent.
Wojcicki's recent comments also echoed the push for neutrality
"If we were held liable for every single piece of content we recommended, we would have to review it," Wojcicki recently told CBS News. "That [given the scale/type of content] would mean there would be a much smaller set of information that people would be finding."