Meta CEO Mark Zuckerberg in 2012. PHOTO: JD Lasica/Wikimedia Commons

Internal Meta communications newly unsealed in a landmark New Mexico child exploitation trial show company employees warned in 2023 that 7.5 million annual child sexual abuse reports on Messenger could disappear. The warning accompanied the platform's switch to end-to-end encryption, a transition CEO Mark Zuckerberg had publicly promoted as a privacy milestone.

The messages, disclosed in a civil lawsuit filed by New Mexico Attorney General Raúl Torrez and now in front of a Santa Fe jury, form part of a wider tranche of documents that also reveals a senior content policy executive writing in 2019 that the encryption plan was 'so irresponsible.'

The trial, the first of its kind against Meta to reach a jury in the United States, opened on Feb. 9, 2026 and is expected to run for seven weeks. It arrives at the same moment that West Virginia filed a separate government lawsuit against Apple over alleged child sexual abuse material (CSAM) on iCloud, placing two of the world's most valuable companies under simultaneous legal scrutiny over similar accusations of shielding harmful content behind encryption.

The National Center for Missing and Exploited Children (NCMEC), the US government-designated organization that receives mandatory CSAM reports from tech companies, recorded roughly 7 million fewer incidents in 2024 than the year before, the largest single-year drop in its history. NCMEC's own analysis attributed the bulk of that decline directly to Meta's encryption rollout.

Inside Meta's December 2023 Encryption Rollout

In December 2023, Meta published a blog post announcing it would begin rolling out default end-to-end encryption for personal messages on Messenger and Facebook. That same day, an internal employee message cited in the New Mexico court filing read, 'There goes our CSER [Community Standards Enforcement Report] numbers next year.' The employee added, according to the filing, that it was as if the company had 'put a big rug down to cover the rocks' and said it was sending fewer child exploitation reports as a result.

The 7.5 million figure cited in the internal messages refers to the annual volume of CSAM-related reports on Messenger that would no longer be visible to detection systems once private conversations were encrypted end-to-end.

End-to-end encryption scrambles a message so that only the sender and recipient can read it, preventing any third party, including the platform, from inspecting the content. On services such as Messenger and Facebook, where adults can find and contact children through public social features, child safety advocates have long argued that this blind spot creates unique dangers not present in closed-network apps.
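The mechanics behind that blind spot can be sketched in a few lines of Python. This is a deliberately simplified classroom example, a toy Diffie-Hellman key exchange paired with an XOR stream cipher, and not Messenger's actual protocol (Meta says Messenger's encryption is based on the Signal protocol, which uses elliptic-curve key agreement and authenticated ciphers). The point it illustrates is structural: the relaying server only ever sees public values and ciphertext, so it has nothing it can scan.

```python
import hashlib
import secrets

# Toy parameters: a well-known Mersenne prime (2**127 - 1) and a small base.
# These are for illustration only and are not cryptographically appropriate.
P = 2**127 - 1
G = 5

# Each endpoint keeps a private value and publishes only G**priv mod P.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both endpoints derive the same shared secret from the other's public value.
# The relaying server sees only the public values and cannot compute it.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

def keystream(secret: int, n: int) -> bytes:
    """Stretch the shared secret into n key bytes (toy KDF via SHA-256)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(secret.to_bytes(16, "big") + bytes([counter])).digest()
        counter += 1
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR stream cipher: the same call both encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(key, data))

plaintext = b"a private message"
ciphertext = xor_cipher(keystream(alice_secret, len(plaintext)), plaintext)
# The platform stores and forwards only `ciphertext`; without an endpoint's
# private key it cannot inspect content, which is why server-side abuse
# scanning goes dark once messages are encrypted end to end.
decrypted = xor_cipher(keystream(bob_secret, len(ciphertext)), ciphertext)
assert decrypted == plaintext
```

User reporting still works in this model, as Meta notes, because a recipient holds the decrypted message and can forward it to the platform; what disappears is proactive server-side detection.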

The unsealed documents were first reported by CNBC on Feb. 20, 2026. Meta responded by acknowledging that the company 'can review and address private encrypted messages if they are reported for child safety-related issues' and said it 'continues to develop safety tools and features.' The company did not dispute the accuracy of the internal messages.

Senior Executives' Internal Alarms From 2019

The documents paint a picture that stretches back much further than December 2023. Meta's head of content policy at the time, Monika Bickert, wrote in a March 2019 internal chat, as Zuckerberg was preparing to publicly announce the encryption plan, 'We are about to do a bad thing as a company. This is so irresponsible.'

She separately accused the company of making 'gross misstatements of our ability to conduct safety operations,' and wrote that she was 'not very invested in helping him sell this.' With encryption in place, she added, 'there is no way to find the terror attack planning or child exploitation' and proactively refer such cases to law enforcement.

A company spokesperson said they're 'confident the evidence will show our longstanding commitment to supporting young people.' PHOTO: YouTube

Antigone Davis, Meta's Global Head of Safety, raised a parallel concern in a 2019 email that compared Messenger's specific risk profile to that of WhatsApp. 'FB [Facebook] allows pedophiles to find each other and kids via social graph with easy transition to Messenger,' she wrote.

She contrasted this with Meta's already-encrypted WhatsApp service, which is not directly linked to a public social network and therefore does not make it easy for strangers to discover and contact children. 'WA [WhatsApp] does not make it easy to make social connections,' Davis wrote, 'meaning making Messenger e2ee will be far, far worse than anything we have seen/gotten a glimpse of on WA.'

A February 2019 internal briefing document, also surfaced in the filings, provided a quantified projection of the damage. Meta estimated that its reporting of child nudity and sexual exploitation imagery to NCMEC the previous year would have fallen from 18.4 million cases to 6.4 million, a drop of 65%, had Messenger already been encrypted.

A subsequent revision of the same document warned the company would have been 'unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings,' according to the filing.

NCMEC's Record Decline and the Senate's Legislative Response

The real-world consequences of Meta's encryption decision are now documented in NCMEC's own published data. The clearinghouse received 29.2 million distinct incidents of suspected child sexual exploitation via its CyberTipline in 2024, compared to 36.2 million in 2023, a fall of approximately 7 million, and the largest single-year decline since NCMEC began collecting such reports.

Meta accounted for 6.9 million of that total reduction, even after NCMEC adjusted for the company's new bundling feature, which consolidates duplicate viral-content reports and is not related to encryption.

What has emerged from the Santa Fe courthouse is a legal reckoning that may force a fundamental renegotiation of the relationship between encryption, platform liability and the duty to protect children. The documents already in evidence suggest Meta knew the cost long before the lawsuits began.