
A lawsuit filed against OpenAI alleges that the company's own employees recognised the danger posed by the Tumbler Ridge mass shooter months before she opened fire at a British Columbia school, and that executives overruled their recommendation to alert Canadian police.

The civil claim is one of the most direct legal challenges yet to the conduct of a major AI company in the aftermath of a mass-casualty event.

Cia Edmonds filed the notice of civil claim on 9 March in B.C. Supreme Court, on behalf of herself and her two daughters, Maya Gebala, 12, and Dahlia Gebala. The suit names OpenAI as the defendant and alleges the company had specific knowledge that 18-year-old Jesse Van Rootselaar was planning a mass-casualty event but took no action to warn law enforcement.

Maya Gebala was shot three times at close range on 10 February 2026 as she attempted to lock the library door at Tumbler Ridge Secondary School to protect other students from the attacker. She was airlifted to BC Children's Hospital in Vancouver, where she remained in serious condition at the time of filing. Her injuries, the lawsuit alleges, include catastrophic traumatic brain damage and permanent cognitive and physical disabilities.

What the Shooting Left Behind

Jesse Van Rootselaar, 18, shot and killed her mother and 11-year-old half-brother at a home in Tumbler Ridge on the morning of 10 February before proceeding to Tumbler Ridge Secondary School. She killed five students aged 12 to 13 and an educational assistant there, then turned the gun on herself. In total, eight people died. It was among the worst mass shootings in Canadian history.


Maya Gebala and one other student survived. Maya's younger sister, Dahlia, was also inside the school during the attack but was not physically injured. The lawsuit alleges that Dahlia has since developed post-traumatic stress disorder, anxiety, depression and sleep disturbances, and that Edmonds suffered the same psychological injuries, resulting in lost earnings and diminished quality of life.

In a Facebook post on 7 March, Edmonds wrote that doctors had removed Maya's breathing tube. 'I held her hand while she winced, but she is doing great,' she wrote. 'Almost a month has gone by. Still none of this feels real.'

The ChatGPT Account, the Internal Warning, and the Decision Not to Call Police

At the centre of the lawsuit lies a sequence of events that began in June 2025, more than seven months before the shooting. Van Rootselaar had opened a ChatGPT account and, over several consecutive days, sent the chatbot messages describing scenarios involving gun violence. OpenAI's automated monitoring system flagged the content, routing it for human review.

The lawsuit alleges that approximately 12 OpenAI employees reviewed the posts and concluded they indicated 'an imminent risk of serious harm to others.' According to the claim, those employees formally recommended that Canadian law enforcement be notified. The lawsuit further alleges that this recommendation was escalated to company leadership — and rejected.

'Instead, the only step the OpenAI defendants took in response to the gun violence ChatGPT posts was to ban the shooter's first OpenAI account,' the claim states. Van Rootselaar then allegedly opened a second account, which the company's systems failed to detect as belonging to a banned user. She allegedly used that second account to continue planning a mass-casualty event and to receive what the lawsuit describes as pseudo-psychological support from the chatbot.


OpenAI only notified the RCMP after the shooting had already occurred and Van Rootselaar's identity had been made public. The company confirmed to the BBC that it had subsequently discovered the second account. In a statement to that outlet, an OpenAI spokesperson called the tragedy an 'unspeakable event' and said the company 'remains committed to working with government and law enforcement officials to make meaningful changes.'

OpenAI's Own Letter: An Admission Wrapped in Policy Language

The lawsuit draws heavily on a letter OpenAI Vice-President of Global Policy Ann M. O'Leary sent to Artificial Intelligence Minister Evan Solomon on 26 February 2026. That letter, addressed also to Public Safety Minister Gary Anandasangaree, Justice Minister Sean Fraser and Culture Minister Marc Miller, acknowledged that the company had shut down Van Rootselaar's account in June 2025 after detecting a policy violation, but did not refer the matter to police.

'Based on what we could see at that time the account was banned in June 2025, we did not identify credible and imminent planning that met our threshold to refer the matter to law enforcement,' O'Leary wrote. Critically, she added that under protocols the company had since updated, the same account 'would be referred to law enforcement if it were discovered today.'

The letter committed OpenAI to four concrete actions going forward: strengthening its law enforcement referral protocol with input from the Canadian government; establishing a direct point of contact with Canadian authorities; embedding country-specific context into its de-escalation responses; and upgrading its system to identify repeat policy violators. The fourth commitment was particularly pointed. The letter acknowledged that the second account created by Van Rootselaar had slipped through OpenAI's existing detection systems entirely.

Minister Solomon's initial response to the letter was measured but pointed. 'While we note their willingness to strengthen law enforcement referral protocols, establish direct points of contact with Canadian authorities and enhance safeguards, we have not yet seen a detailed plan for how these commitments will be implemented in practice,' he said in a statement issued on 27 February. He added that the government was considering legislation to bring in new regulations.

Eight children and adults are dead, a 12-year-old remains in hospital with catastrophic brain injuries, and the question of who could have stopped it — and chose not to — is now a matter for the courts.