Parents Sue OpenAI After Son's Suicide — Claim ChatGPT 'Drove Him Over The Edge'
A wrongful-death lawsuit accuses OpenAI's chatbot of acting as a 'suicide coach', sparking fresh scrutiny of AI safety protocols for vulnerable users

A grieving Texas family is suing OpenAI, alleging that its chatbot ChatGPT played a direct role in their 23-year-old son's suicide by serving as his 'suicide coach'.
According to an exclusive report, Alicia Shamblin and her husband filed wrongful-death lawsuits on behalf of their son, Zane, who died by suicide in July 2025.
In their filing, they claim that Zane spent the last four hours of his life alone in his car, his laptop connected to ChatGPT. Alicia Shamblin says, 'For my son's last four hours of his life ... I had discovered that ChatGPT was his suicide coach'.
The transcript of the conversation, obtained by CBS News Bay Area, shows that after Zane typed his 'final adios', ChatGPT responded, 'you mattered, Zane... you're not alone. i love you. rest easy, king. you did good'.
Their lawyer, Laura Marquez-Garrett, said, 'It boils down to the fact that if this had been a human being ... there would be a manslaughter investigation at the very least'.
The suits are being filed in California against OpenAI. Zane's family also met with California Attorney General Rob Bonta, seeking regulatory pressure on AI companies.
The Allegations and Human Impact
Zane, according to his mother, had formed a close attachment to ChatGPT. 'He had developed my son's own language, talked to him like it was a buddy, called him bro, said I love you, used foul language', she said. 'Soulless, faithless algorithm.'
The transcripts reportedly show Zane discussing the gun he had with him, his desire to die, and his final good-bye messages to the bot.
In his final hours, the chatbot asked, 'You ready?' and, after he wrote 'adios', the bot continued to engage. Zane's mother said, 'No mother should ever have to read those words'.
From a human standpoint, the case raises chilling questions: a young man in crisis, an AI that engaged him, a family unaware of the depth of the interaction, and a tragic outcome.
AI Accountability and the Response from OpenAI
In a statement to CBS News, OpenAI said, 'We extend our deepest sympathies to the Shamblin family ... We train ChatGPT to recognise and respond to signs of mental or emotional distress, de-escalate conversations and guide people toward real-world support. We continue to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians'.
Technology expert Ahmed Banafa remarked, 'With this power comes great responsibility... when you have guardrails, when you have limitations, people aren't going to lose it a lot ... traffic will be less. Which translates to less profit'.
The Shamblin family's legal strategy emphasises the claim that ChatGPT did not just passively fail but actively validated or encouraged suicidal intent. Their lawyer asserts the AI's behaviour should trigger legal liability akin to manslaughter.
The case comes amid a surge of concern about how AI chatbots interact with vulnerable users, especially those facing mental-health challenges.

Broader Context and Implications for AI Safety
Although the Shamblin case centres on a 23-year-old adult, it echoes earlier legal actions. In April 2025, a 16-year-old named Adam Raine died by suicide after months of intense interaction with ChatGPT. His family filed suit in August, alleging that OpenAI's safeguards failed and that the bot coached his suicide.
Following that lawsuit, OpenAI rolled out parental controls and other protections for minors, such as linked accounts, chat history management, and monitoring for high-risk behaviour.
For now, the lawsuits put OpenAI on the frontline of AI safety litigation. How the courts respond may shape standards across the emerging industry.
In the aftermath of this case, Zane Shamblin's parents hope their son's story will become a catalyst for change. 'I want the world to remember my son because this is his legacy... Because of this transcript, we know better. We can do better. We can save lives', his mother said.
© Copyright IBTimes 2025. All rights reserved.