
A 16-year-old girl's suicide after months of online grooming has exposed the shadowy nexus of violent cyber-extremist groups like 764 and the persistent dangers facing children on user-generated platforms such as Roblox. Her death last February was the culmination of a two-year cycle of manipulation by an online predator known only as 'Culprit,' a username she reportedly first encountered on the Roblox gaming platform.

Her distraught father, Jason Sokolowski, discovered that she had been coerced into self-harm, carving '764' and other identifiers linked to extremist grooming communities into her flesh before taking her own life. Sokolowski alleges that the predator groomed his daughter from afar before moving their communication to secondary platforms such as Discord, where the abuse intensified.

The 764 Network: An Online Extremist Threat

The group behind the numbers '764' is not a benign gaming clan; it is part of an internationally active sextortion and grooming network that targets minors on social media, gaming platforms and messaging apps. The Federal Bureau of Investigation (FBI) classifies 764 as a violent online group that uses psychological manipulation and threats to coerce children to produce graphic self-harm and sexually explicit material.

Founded in 2021 by Bradley Chance Cadenhead, 764 grew out of earlier online extremist circles and quickly attracted a decentralised membership. Members of the network operate primarily on chat apps such as Discord and Telegram and, to a lesser extent, in environments like Roblox where initial contact with minors occurs.

Law enforcement press releases and criminal indictments from the US Department of Justice confirm that 764 affiliates have been charged with coercion, enticement and other crimes involving minors. In November 2025, prosecutors in Maryland unsealed an indictment charging an alleged member, Erik Lee Madison, with multiple counts of sexual exploitation and coercion of children, highlighting coordinated efforts to use digital platforms to entrap vulnerable youths.

Federal public service announcements stress the threat posed by 764, noting that members seek to manipulate children into self-harm, murder and the creation of child sexual abuse material through coercive tactics once trust is established.

Grooming Mechanisms: How Predators Exploit Roblox

Roblox is designed as a creative space where users build and join games, chat and make connections with others. However, its enormous user base includes millions of underage players, with roughly half its users being under 13 years old.

According to prosecutors and FBI advisories, predators often initiate seemingly innocuous conversations in public game spaces on Roblox before shifting interactions to private messaging on Discord or Snapchat where they attempt to isolate and groom their victims. Once a child is engaged one-to-one, coercive dynamics can accelerate into demands for self-harm, explicit imagery or behavioural compliance.

Sokolowski's account of his daughter's interactions reflects this pattern. He reported discovering graphic images and messages between his daughter and the predator, culminating in self-harm escalations that ended with her suicide. He believes the groomer was linked to the 764 network.

Such cases are not isolated. Across the United States, dozens of families have filed lawsuits against Roblox and related platforms alleging similar patterns: initial contact on Roblox, migration to chat apps, then grooming, coercion and, in some cases, suicide. These legal actions argue that Roblox failed to provide robust age verification or effective parental controls to prevent adults and extremists from accessing vulnerable children.

Platform Response and Legal Pressure

Roblox Corporation has responded to safety concerns with a suite of measures intended to restrict predator access and better protect minors. In November 2025, the company rolled out a mandatory facial age estimation process for access to chat features, using encrypted biometric checks processed through a third party, with the explicit aim of limiting chat to users within similar age brackets.


Roblox's Chief Safety Officer framed these changes as part of a broader effort to lead the industry in online safety and give parents more control over their children's in-platform interactions. Despite these efforts, critics argue that age verification is easily bypassed: verified accounts are sold on secondary markets, and parents sometimes inadvertently verify children as adults, neutralising the safeguards.

The identity of 'Culprit' has not been publicly disclosed by law enforcement, but the wider context of 764 and similar extremist grooming networks offers a troubling insight into the mechanisms that pushed a young girl to take her life – a stark reminder of the deadly risks lurking behind seemingly innocent online play.

The death of this teenager in the grip of an online predator is a devastating testament to the urgent need to dismantle grooming networks and fortify digital child protection worldwide.