Meta accused of hiding child safety lapses on Instagram, Facebook

Meta, the parent company of Instagram and Facebook, is under fire for allegedly concealing internal research about serious risks to kids and teens on its platforms.
Reports say Meta knew its platforms hosted many underage users and was aware of problems such as adults contacting minors and teens' mental health struggles, yet waited years to act.

Repeated violations overlooked

A former Meta vice president said, "They don't meaningfully care about user safety."
Court documents describe Meta's "17x strike" policy, under which accounts could break the rules 16 times, including for sex trafficking, and be suspended only on the 17th violation.
Internal reviews also found millions of adults contacting minors on its platforms, while harmful content about eating disorders and child sexual abuse was rarely removed.

Insider testimony highlights resistance to change

Vaishnavi Jayakumar, Instagram's former head of safety, testified that the strike policy enabled repeated violations.
The brief also indicated that reporting tools for child abuse content were often ineffective, and former executives said Meta resisted meaningful safety changes even when the harm was clear.

Critics question Meta's promises

Meta says it takes user safety seriously and is continually working on improvements.
But critics argue those assurances ring hollow given how slowly the company has moved to protect young users.