20% teens see unwanted nude content on Instagram: Court filing
The document was made public on Friday

Feb 24, 2026
02:07 pm

What's the story

A recent court filing in a landmark US social media addiction trial has revealed that nearly one in five teens aged 13 to 15 have seen unwanted nude content on Instagram. The document, made public on Friday, included parts of a March 2025 deposition by Adam Mosseri, head of the photo-sharing platform.

Target audience

Meta's internal documents reveal focus on teen user acquisition

The trial has also put Meta, the parent company of Instagram, under the microscope for its alleged focus on attracting young users. A January 20, 2021 internal document revealed that a Meta researcher suggested targeting teens and young adults due to their influence within households. The memo read, "If we're looking to acquire (and retain) new users we need to recognize a teen's influence within the household to help do so."

Legal challenges

Lawsuits against Meta highlight mental health crisis among minors

Meta is facing a slew of lawsuits around the world accusing the company of creating products that harm young users. In the US alone, thousands of federal and state court lawsuits allege that Meta designed addictive products and contributed to a mental health crisis among minors. The claims come amid growing concerns over social media's impact on youth well-being.

Content moderation

Disturbing content on platform raises concerns

In response to concerns over explicit content, Meta has said the statistic on explicit images came from a 2021 survey of Instagram users about their experiences on the platform. Mosseri's deposition also revealed that around 8% of users in the 13-15 age group had "seen someone harm themselves or threaten to do so on Instagram."

Policy changes

Meta to ban explicit content for under-18 users

In 2025, Meta announced plans to remove images and videos containing nudity or explicit sexual activity for teen users. This policy would also apply to AI-generated content, with exceptions considered for medical and educational purposes. "We're proud of the progress we've made, and we're always working to do better," said Andy Stone, a spokesperson for Meta.
