Meta CEO Mark Zuckerberg in 2012. JD Lasica/Wikimedia Commons

Unsealed court documents from a class-action lawsuit against Meta reveal allegations that the company maintained a '17-strike' policy, suspending accounts engaged in human trafficking only after a 17th violation. The filings, part of litigation representing more than 1,800 plaintiffs, suggest a systemic pattern of prioritising user growth over the implementation of safety protocols for minors.

Internal Testimony on Enforcement

According to the lawsuit, which was unsealed on Friday, 21 November, the evidence includes testimony from Vaishnavi Jayakumar, Instagram's former head of safety and well-being. When Jayakumar joined Meta in 2020, she reportedly discovered the company's lax policies on online safety. Specifically, she testified that Meta had a '17x strike policy for accounts that repeatedly engaged in sex trafficking'.

'You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,' said Jayakumar. '... By any measure across the industry, a very, very high strike threshold.'

Jayakumar is one of several former and current Meta employees who testified as part of the proceedings. The brief was filed in the Northern District of California on behalf of more than 1,800 plaintiffs, including children and their parents, school districts, and state attorneys general.

Meta Downplayed Risks and Did Not Tell Congress

The plaintiffs allege that Meta was aware of the risks its platforms pose to children and engaged in a pattern of deceit to downplay them. The company was reportedly aware that millions of adult strangers were contacting minors on its platforms, contributing to worsening mental health issues among teenagers. It was also allegedly aware of content depicting eating disorders, sexual abuse, and suicide, yet neither enforced protections nor informed Congress on the matter.

'Meta has designed social media products and platforms that it is aware are addictive to kids, and they're aware that those addictions lead to a whole host of serious mental health issues,' said Previn Warren, a co-lead attorney for the plaintiffs.

'Like tobacco, this is a situation where there are dangerous products that were marketed towards kids,' Warren explained. 'They did it anyway, because more usage meant more profits for the company.'

Meta Sought Younger Users

The plaintiffs claim that since 2017, Meta has aggressively pursued younger users, despite internal research showing that its social media sites can be addictive and harmful to children. Meta employees reportedly proposed safeguards to address these harms, yet were repeatedly blocked by executives concerned about the potential impact on the company's growth.

Meta has since introduced safety features to address some of the issues raised by the plaintiffs. In 2024, for example, it unveiled Instagram Teen Accounts, which automatically place users aged between 13 and 18 into accounts that are private by default, limit exposure to sensitive content, turn off notifications at night, and restrict messaging from unconnected adults.

However, the plaintiffs allege that Meta resisted implementing such safeguards for years.

'My feeling then and my feeling now is that they don't meaningfully care about user safety,' said Brian Boland, a former Meta vice president of partnerships who resigned in 2020. 'It's not something that they spend a lot of time on. It's not something they think about. And I really think they don't care.'