US states began investigating Meta some two years ago after a Facebook whistleblower leaked a trove of documents and accused the social media titan of putting profit over safety on its platforms.

Court documents have emerged alleging that Meta, the parent company of social media giant Facebook, intentionally designed its platforms to induce addiction, particularly among children.

The revelations, outlined in legal filings, have ignited widespread concern and renewed calls for increased scrutiny of the tech industry's practices, especially when it comes to safeguarding the well-being of young users.

The court documents, which were made public as part of an ongoing legal case against Meta, assert that the company employed sophisticated algorithms and features to exploit psychological vulnerabilities in children, effectively creating an environment conducive to addiction.

The allegations have brought the spotlight back on the ethical responsibilities of tech companies and the potential consequences of prioritising engagement metrics over user well-being.

The complaint is a pivotal component of a legal action initiated against Meta by the attorneys general of 33 states in late October.

Initially redacted, the complaint contends that the social media giant was aware of, yet never disclosed, the millions of complaints it had received about underage users on Instagram.

Disturbingly, the company purportedly disabled only a fraction of the reported accounts. The lawsuit contends that the prevalence of underage users was an "open secret" within the company, referencing internal documents as evidence.

One specific instance highlighted in the legal action refers to an internal email thread where Meta employees deliberated on the non-deletion of four accounts belonging to a 12-year-old girl.

The accounts had been flagged by the girl's mother, who explicitly stated her daughter's age and requested their removal. Even so, the complaint states that "the accounts were ignored", partly because Meta representatives "couldn't definitively confirm the user's underage status".

According to the complaint, Meta received a staggering 402,000 reports of Instagram users under the age of 13 in 2021. However, only 164,000 of those accounts, fewer than half, were "disabled for potentially being under the age of 13" during that year.

Notably, the complaint highlights that Meta at times has a backlog of up to 2.5 million accounts belonging to younger children awaiting action.

The legal filing asserts that these occurrences, among others, contravene the Children's Online Privacy Protection Act (COPPA).

This federal law requires social media companies to provide notice and obtain parental consent before collecting data from children.

Furthermore, the lawsuit delves into persistent claims that Meta deliberately crafted products designed to be addictive and detrimental to children, claims brought into the spotlight by whistleblower Frances Haugen.

Haugen disclosed internal studies indicating that platforms like Instagram exposed children to content related to anorexia. She also alleged that the company intentionally targets individuals under the age of 18.

Child safety advocates and concerned parents have long raised alarms about the impact of social media on the mental health of young users.

The court documents provide a more detailed and damning account of the design choices Meta allegedly made to capitalise on children's vulnerability to addictive behaviours.

Critics argue that these documents underscore the need for greater transparency and accountability within the tech industry, especially concerning the impact of their products on the well-being of vulnerable user groups.

The allegations against Meta come at a time when governments and regulatory bodies worldwide are increasingly scrutinising the practices of major tech companies.

In the United Kingdom, discussions about digital regulation and online safety have gained momentum, with policymakers exploring ways to protect users, particularly children, from potential harm arising from online platforms.

Child protection organisations have seized upon the revelations to emphasise the urgency of implementing robust regulations to curb the potential negative impacts of social media on young users.

The allegations have reignited the debate over the age-appropriateness of certain platform features and the need for comprehensive measures to ensure the responsible use of technology, especially among vulnerable demographics.

Meta, in response to the allegations, issued a statement vehemently denying that it intentionally designed its products to foster addiction among users, especially children.

The company argued that it has invested significant resources in developing tools and features aimed at promoting user well-being and controlling screen time.

As the legal battle unfolds, the allegations against Meta highlight the pressing need for a nuanced and comprehensive approach to digital regulation.

Balancing innovation with user protection, particularly for vulnerable groups like children, remains a complex challenge that policymakers, tech companies, and society at large must navigate collaboratively to ensure a safer and more responsible digital future.