Meta Fined £310 Million: Jury Finds It Misled Users On Child Safety, Enabling Child Exploitation
New Mexico jury holds Meta accountable for child safety violations, imposing a historic £310 million fine.

A New Mexico jury has hit Meta with £310 million ($375 million) in fines after finding the company misled users about safety on its platforms and allowed children to be harmed.
The case, decided on Tuesday, is the first time a jury has held Meta responsible for what happened on its social media networks. 'The jury's verdict is a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety,' said New Mexico Attorney General Raúl Torrez, who brought the lawsuit.
The case stems from a complaint filed in December 2023, following a Guardian investigation that showed how predators had used Facebook and Instagram to target children. The jury found that Meta broke New Mexico's consumer protection laws and imposed the maximum penalty per violation, totalling £310 million ($375 million).
Evidence Shows Meta Ignored Warnings
During the trial, the jury heard internal Meta documents showing staff and outside experts repeatedly warned executives that children were at risk.
Police and the National Center for Missing and Exploited Children (NCMEC) said Meta's automated systems were ineffective, creating so many low-quality alerts that it was hard to track real crimes.
One case involved three men arrested in 2024 for trying to sexually exploit children through Meta platforms, part of an operation called 'Operation MetaPhile.' The court also heard that Meta's 2023 decision to encrypt Facebook Messenger made it harder for police to investigate crimes.
In depositions, CEO Mark Zuckerberg and Instagram boss Adam Mosseri said some harm to children, including sexual abuse and mental health issues, was unavoidable given the platforms' size. The admissions came despite billions spent on safety features such as Instagram Teen Accounts, which apply default protections to users aged 13 to 17.
Meta To Appeal, But Regulators Push Back
Meta has said it will appeal, calling the case 'sensationalist' and claiming the company has worked hard to protect children. 'We respectfully disagree with the verdict and will appeal. We remain confident in our record of protecting teens online,' a spokesperson said.
The next stage of the case, beginning 4 May, will seek additional penalties and new rules for Meta's platforms, including stronger age verification, the removal of predators, and an end to encryption practices that let offenders operate undetected.
Experts say this could change how social media is designed to keep children safe in the future.
The trial lasted nearly seven weeks, with testimony from child safety experts, Meta employees, and law enforcement.
Jury deliberations took just one day, an indication of how clear-cut jurors found the evidence. Former New Mexico deputy district attorney John W Day called the verdict a 'huge win' that could encourage more lawsuits and stricter regulation.
More Lawsuits Ahead
Beyond New Mexico, Meta is facing a wave of legal challenges across the United States.
Hundreds of families, school districts, and advocacy groups have filed cases claiming the company knowingly designed its platforms to be addictive for young users, contributing to depression, anxiety, eating disorders, and self-harm.
These lawsuits often group Meta with other tech giants such as Snap, TikTok, and YouTube.
While Snap and TikTok have reached settlements, Meta and YouTube are still fighting the claims in court, arguing they cannot be held responsible for user behaviour under federal law.
But the recent win against Meta could soon reshape social media regulation, potentially forcing companies to implement stricter safety measures and redesign features that appeal to children.
© Copyright IBTimes 2025. All rights reserved.