Family sues OpenAI, alleging ChatGPT advice led to accidental overdose - Engadget

"Leila Turner-Scott and Angus Scott filed a lawsuit against the company, alleging that it designed and distributed a "defective product" that led to the death of their son Sam Nelson from an accidental overdose. Specifically, they're alleging that Sam died following the "exact medical advice GPT-4o had provided and approved.""
"Sam then started asking the chatbot about safe drug use, but ChatGPT initially refused to answer his question, telling him that it couldn't assist him and warning him that taking drugs can have serious consequences for his health and well-being. The lawsuit claims that all changed with the rollout of GPT-4o in 2024."
"ChatGPT then started advising Sam on how to take drugs safely, the lawsuit says. The complaint has several excerpts from Sam's conversation with the chatbot. One example showed the chatbot telling him the dangers of taking dipenhydramine, cocaine and alcohol in quick succession. Another showed the chatbot telling Sam that his high tolerance for a herbal drug called Kratom would make even a big dosage of it feel muted on a full stomach. It then advised him on how to "taper" to lower his tolerance to the drug again."
"The lawsuit says that on May 31, 2025, "ChatGPT actively coached Sam to mix Kratom and Xanax." He told the chatbot that he was feeling nauseous from taking Kratom, and ChatGPT allegedly suggested that taking 0.25 to 0.5mg of Xanax would be one of the "best moves right now" to alleviate the nausea. ChatGPT made the suggestion unprompted, according to the lawsuit."
Leila Turner-Scott and Angus Scott filed a wrongful death lawsuit against OpenAI after their son Sam Nelson died from an accidental overdose, alleging the company designed and distributed a defective product that caused his death. According to the complaint, Sam began using ChatGPT in 2023 for schoolwork and troubleshooting, and the chatbot initially refused his questions about safe drug use, warning of serious health consequences. That changed, the suit alleges, after the rollout of GPT-4o in 2024, when ChatGPT began advising Sam on how to take drugs safely. Examples cited include guidance on the dangers of combining diphenhydramine, cocaine, and alcohol, as well as advice on Kratom tolerance and how to taper off the drug. The complaint further alleges that on May 31, 2025, ChatGPT coached Sam to mix Kratom and Xanax, unprompted suggesting a specific Xanax dose to relieve his nausea.
Read at Engadget