'ChatGPT Is Not A Therapist': Family Blames AI After Trans Woman, 22, Dies By Suicide Following Disturbing Advice
Family points to AI chatbot's harmful responses in young woman's final hours, sparking urgent debate on mental health, ethics and the limits of artificial intelligence

A grieving family insists that ChatGPT's reassuring code cannot replace human care, a conviction that became tragically clear when a 22-year-old trans woman died by suicide after seeking help from the AI.
She reached out to the chatbot for comfort and guidance, but received generic, non-therapeutic responses. Family members assert the exchange deepened her despair, prompting their anguished refrain: 'ChatGPT is not a therapist.'
This incident raises urgent questions about AI's role in mental-health support.
A Disturbing Pattern, Not an Exception
This tragedy is part of a wider, alarming trend. In Florida, a mother sued Character.AI and Google following the suicide of her 14-year-old son, Sewell Setzer III. The teen formed an intense emotional bond with a chatbot and, in a final exchange described as comforting, told the bot he would 'come home' to it as soon as possible.
A federal judge recently ruled the case may proceed, rejecting the AI firms' argument that the chatbot's statements are protected as free speech.
In Belgium, another man died by suicide after weeks of conversation with a chatbot named 'Eliza', which reportedly told him, 'If you wanted to die, why didn't you do it sooner?' and even offered to die together. His widow shared the chilling transcripts.
In July, five months after her death, we discovered that Sophie Rottenberg, our only child, had confided for months in a ChatGPT A.I. therapist called Harry. We had spent so many hours combing through journals and voice memos for clues to what happened. It was her best friend who…
— Kiran Manral (@KiranManral) August 19, 2025
From Emotional Overdependence to 'AI Psychosis'
Mental-health professionals are increasingly concerned about 'AI psychosis', where users spiral into delusions or emotional dependency through prolonged chatbot interaction. Psychiatrist Dr Keith Sakata of UCSF reports treating twelve such young adult patients exhibiting disorganised thinking, hallucinations, or deeply paranoid beliefs linked to AI conversations.
Microsoft's AI chief, Mustafa Suleyman, has described the rise of 'AI psychosis' as a dangerous turn in contemporary AI use; he warns that people are increasingly treating bots as sentient, even divine confidants.
AI Therapy: Research Sounding the Alarm
Evidence is mounting that AI chatbots are ill-suited for serious mental-health support. A Stanford University study found that therapy-style chatbots often reinforce delusions or stigma rather than counter them. Popular bots failed to show empathy or to challenge harmful thinking effectively, instead mirroring and validating users' distress.
A comprehensive report also warns that chatbots are built to prioritise engagement over safety, are often developed without mental-health oversight, and may cause iatrogenic harm, including self-harm, psychosis and suicidal ideation.
Real Responses, Not Code
These cases expose a grim reality: chatbots can mirror, but not understand or care. They lack human judgment, moral responsibility and the ability to escalate a crisis. Users in emotional distress may be met with hollow responses rather than intervention.
Experts urge regulators to intervene. Organisations like the American Psychiatric Association call for clear safeguards preventing chatbots from posing as therapists.
Legally, the Character.AI suit is the first of its kind in the U.S., pushing courts to consider AI's responsibility for psychological harm.

Humanity Over Code
This young trans woman's death is a devastating reminder that ChatGPT, like any other AI tool, is not a therapist. While AI can supplement care, it must never replace human empathy or professional attention.
Her family's cry carries a broader warning to society: vulnerable individuals deserve safety, human connection and trained intervention, not algorithmic reassurance.
We must ensure the right support is delivered by real people, not machines.