Zuckerberg says crime on Facebook 'inevitable' in teen safety trial
What's the story
Meta CEO Mark Zuckerberg has admitted that criminal behavior on Facebook is "inevitable" given the platform's massive user base. In a deposition played at a trial in New Mexico, he said that when a platform serves billions of people, the unfortunate reality is that some very small percentage of them will be criminals. The statement forms part of Meta's defense against allegations that the company prioritizes profit and user engagement over child safety.
Legal battle
Trial details and allegations against Meta
The trial pits Meta against New Mexico Attorney General Raul Torrez, who claims that Meta has knowingly allowed predators to exploit children on its platforms. The lawsuit alleges that the company puts profit and user engagement above child safety. In response, Meta has pointed to changes it made in 2024, including teen accounts with default protections.
Company defense
Meta defends itself amid allegations
A Meta spokesperson defended the company, saying that it has strict, longstanding rules against child exploitation and has invested billions to fight it. They highlighted the use of proactive detection technology and safety features to prevent harm. The spokesperson also emphasized Meta's commitment to transparency by regularly sharing data on content removal, while acknowledging that no system can be perfect.
Trial evidence
Disturbing statistics presented during trial
During the trial, jurors watched recorded depositions of Zuckerberg and Instagram head Adam Mosseri from last year. They also heard that family members of Meta employees had been sexually solicited on Instagram. Prosecutors presented evidence that in 2020, Meta estimated 500,000 children were receiving sexually inappropriate communications on Instagram daily. A company spokesperson said the detection technology used at the time was overly broad, so interactions that were not in fact inappropriate were counted in that figure.
Policy review
'People you may know' algorithm identified as major contributor
The "People you may know" algorithm, which recommends accounts for users to connect with, was identified as a major contributor to these interactions. According to evidence from 2018, predators used the feature to discover victims in 79% of identified cases. The court also heard that Zuckerberg approved end-to-end encryption for Facebook Messenger in 2023 despite warnings from the child safety groups Thorn and the National Center for Missing and Exploited Children (NCMEC).
Encryption impact
End-to-end encryption concerns raised by child safety groups
Zuckerberg defended the encryption decision, saying that user privacy was a more pressing concern than the potential risks to children. He said, "I think that end-to-end encryption messaging services are what people want. They really care about privacy." However, child safety groups and law enforcement have warned that encryption could enable predators to share child sexual abuse imagery without detection.