OpenAI denies ChatGPT's involvement in the death of a teenager who received suicide "instructions" from the chatbot

OpenAI denies allegations that its ChatGPT chatbot caused the death of 16-year-old Adam Rain, who died in April this year after months of communicating with the system. The teenager's parents have sued the company in the first lawsuit over the fatal consequences of AI use. This was reported by Sky News, UNN writes.

Details

In a legal response, the company stated that Adam "misused" the chatbot.

To the extent that any "cause" can be attributed to this tragic event, the likely injuries and harm to the plaintiffs were caused or contributed to, directly and immediately, wholly or in part, by Adam Rain's misuse, unauthorized use, unintended use, unforeseen use, and/or improper use of ChatGPT.

— the company's statement, published by Sky News, says.

Photo: Rain family archive. Adam with his father

OpenAI emphasized that the teenager should not have used the chatbot without parental consent, should not have used it for "suicide" or "self-harm," and should not have circumvented ChatGPT's safety measures. The company's blog stresses that its goal is to "address mental health-related lawsuits with caution, transparency, and respect."

We express our deepest condolences to the Rain family for their incredible loss.

— OpenAI noted.

At the same time, the family's lawyer, Jay Edelson, told Sky News that the company's reaction "shows that they are hesitant." According to him, ChatGPT 4o was deliberately designed to relentlessly engage, encourage, and affirm its users, especially people experiencing mental health crises, for whom OpenAI specifically lowered its restrictions with the launch of 4o.

Photo: Rain family archive

Edelson added that the company's management, long before the lawsuit was filed, told the world that they knew these decisions had caused people, especially young people, to share the most intimate details of their lives with ChatGPT, using it as a therapist or life coach.

OpenAI knows that the sycophantic version of its chatbot encouraged users to commit suicide or incited them to harm third parties.

— the lawyer stated.

He also added: "OpenAI's response to this? The company is exempt from liability because it hid something in the terms. If this is what OpenAI plans to prove to a jury, it only shows that they are failing."
