
The digital landscape has become a modern battlefield for a group of grieving parents determined to hold social media giants accountable for the safety of their children. Ellen Roome, a British mother who lost her son, Jools Sweeney, is currently leading a high-profile legal challenge against the social media platform TikTok.

Following the conclusion of the first court hearing, Roome has stepped forward to provide a sombre update on a case that could fundamentally alter how tech companies operate in the United Kingdom. She made it clear that the parents want platforms accessible to children to be held accountable because their 'children matter.'

Court Hearing Update

Ellen Roome and three other parents, Liam Walsh, Lisa Kenevan, and Dominic, flew to the United States for a hearing in their case against TikTok. The group is suing the platform for wrongful death after losing their children.

Following her appearance at the initial legal proceedings, Roome took to social media to communicate directly with those supporting her campaign. In a detailed Facebook post, she described the experience as an essential yet harrowing step in her quest for justice.

'Sitting in court, hearing our children's lives discussed in such a cold, legal way, is something no parent should ever have to endure,' she wrote. 'And yet, here we are, because accountability matters and because our children matter.'

The hearing marks the beginning of a formal legal examination into the platform's role regarding the content her son was exposed to before his untimely passing. Roome expressed that the process is not merely about her own loss, but about ensuring that other families are spared similar tragedies.

She noted that the legal journey would be long and arduous, yet she remains steadfast in her commitment, saying the decision to take legal action is 'driven by love, determination, and a shared belief that what happened to our children must lead to change.'

The Legal Challenge

Roome is part of a larger collective of five families who have joined forces to take on the social media giant. These parents allege that TikTok is responsible for the wrongful deaths of their children – Jools, 14; Mia, 13; Noah, 14; Archie, 12; and Isaac, 13 – due to the platform's addictive nature and its recommendation algorithms.

The families argue that the site pushed harmful, age-inappropriate content to their children's feeds, which ultimately contributed to their deaths. All of them believe their children had seen the blackout challenge on the platform, an internet challenge based on a choking game that has gained widespread attention on TikTok.

The five parents want access to data showing what their children were watching before their deaths. The legal team representing the parents asserts that TikTok failed in its duty of care by prioritising engagement metrics over the safety of minor users. The parents contend that the platform's design intentionally creates a rabbit hole effect, making it difficult for young users to escape dangerous content loops.

TikTok Responds to Allegations as Legal Scrutiny Intensifies

In response to escalating legal pressure and specific claims by the group of parents, TikTok has issued statements defending its safety record. The company maintains that it has robust policies in place to remove content that violates its community guidelines, particularly content related to self-harm and dangerous challenges.

The company also expressed its deepest sympathies to the families involved, but it has consistently denied legal liability for the individual tragedies. 'Our deepest sympathies remain with these families. We strictly prohibit content that promotes or encourages dangerous behaviour,' TikTok's spokesperson said, according to Manchester Evening News.

Regarding access to the data showing what the children were watching before their deaths, TikTok said the data had likely been deleted under data privacy rules, Sky News reported. Additionally, TikTok wants the case dismissed because it was reportedly filed in the wrong place, as the alleged harm occurred in the UK while the company was sued in the US.

The outlet added that TikTok had already removed '99 per cent' of the content found to break these rules, even before it was reported. 'As a company, we comply with the UK's strict data protection laws,' it added.