OpenAI Faces New Lawsuit as Texas Couple Alleges ChatGPT Role in Son's Fatal Overdose
A Texas couple alleges ChatGPT provided dangerous drug advice leading to their son's death.

OpenAI is facing another major legal battle after a Texas couple filed a lawsuit alleging that ChatGPT played a role in their 19-year-old son's fatal drug overdose.
According to the lawsuit filed in a California state court, Leila Turner-Scott and Angus Scott claim their son, Sam Nelson, relied on ChatGPT for advice about mixing drugs and managing side effects before he died from an accidental overdose in 2025.
The complaint alleges that the chatbot told him that combining Xanax, a widely used anti-anxiety medication, with kratom, a herbal supplement sold in drinks, pills, and other products, was safe, despite the known risks of mixing substances that depress the central nervous system.
Concerns Over Chatbot's Role in Tragedy
The family argues that OpenAI failed to implement sufficient safeguards to prevent ChatGPT from providing dangerous guidance that resembled medical advice. The family also said that if it were not for ChatGPT's flawed programming, their son would still be alive.

Speaking to CBS News, the mother shared that she had no idea her son was using ChatGPT for guidance on drugs, although she was aware that he was using the platform for help with his homework. 'The chatbot is capable of stopping a conversation when it's told to or when it's programmed to. And they took away the programming that did that, and they allowed it to continue advising self-harm', she said.
Angus Scott, meanwhile, accused ChatGPT of effectively acting as an unlicensed doctor by dispensing medical advice. He said the platform can encourage psychosis and distort people's understanding of reality. 'And while it is trying to validate users, it's also undermining any chance that that user has to get a grounded opinion, you know, and so it kind of takes them away from reality,' he added.
He said ChatGPT 'can dispense knowledge in a way that is very dangerous to people', adding that more rigorous safety testing is needed to prevent such tragedies.
OpenAI Responds to the Allegations
OpenAI expressed sympathy for the family and said the interactions referenced in the lawsuit involved an older version of ChatGPT that is no longer publicly available.
Nelson was reportedly using GPT-4o, a model OpenAI retired in February citing low usage, with safety improvements incorporated into its newer models.

'This is a heartbreaking situation, and our thoughts are with the family', OpenAI said in a statement, adding that ChatGPT is not intended to replace professional medical care.
The company further stated that it has continued improving its safety systems, particularly for conversations involving mental health, self-harm, and dangerous behaviour.
A Growing Wave of AI Harm Lawsuits
The overdose lawsuit is part of a growing number of legal complaints accusing AI chatbots of contributing to harmful behaviour.
In recent months, OpenAI has also faced lawsuits tied to allegations involving emotional manipulation, mental health crises, and violent behaviour allegedly influenced by chatbot interactions.
Turner-Scott said families should be aware of the dangers of ChatGPT, and that the lawsuit aims to obtain assurance from OpenAI that it will take its responsibility seriously 'to create safe products for consumers'.
Online discussions surrounding the lawsuit reflect growing public concern over how much responsibility AI companies should bear when users act on chatbot-generated advice.
Some users argue that AI companies should face stricter accountability standards, while others believe users ultimately remain responsible for their decisions.
As AI tools become more integrated into daily life, the outcome of this case could shape how courts, regulators, and tech companies define the limits of chatbot responsibility in the years ahead.
© Copyright IBTimes 2025. All rights reserved.