ChatGPT developer plans to report young people planning suicide to authorities

Kyiv • UNN

Sam Altman, CEO of OpenAI, has expressed concern about potential suicides among ChatGPT users. The company is considering informing the authorities about users' suicidal intentions, especially in the case of young people.

Sam Altman, CEO of OpenAI, the company behind the ChatGPT chatbot, has expressed concern that up to 1,500 people per week, mostly young people, may be discussing suicide with the chatbot before going on to take their own lives. The chatbot's developers are working on a system to prevent such situations, including by informing the authorities, UNN reports, citing The Guardian.

Details

ChatGPT's user base is now estimated at around 700 million people. Altman said the decision to train the system to alert the authorities about such emergencies is not yet final, but added that "it's very reasonable that in cases where young people are seriously talking about suicide, when we can't reach the parents, we call the authorities."

Altman discussed the potential changes in an interview with podcaster Tucker Carlson on Wednesday. The interview came after OpenAI and Altman were sued by the family of Adam Raine, a 16-year-old from California who died by suicide after what his family's lawyer called "months of support from ChatGPT."

According to the lawsuit, the chatbot helped the teenager work out whether his planned method of suicide would be effective and offered to help him write a suicide note to his parents.

Altman said the issue of user suicides kept him up at night. It was not immediately clear which authorities would be contacted or what information OpenAI could share about a user, such as a phone number or address that might help in providing assistance.

It would be a notable change of policy for the AI company, said Altman, who emphasized that "user privacy is really important." He said that currently, if a user expresses suicidal thoughts, ChatGPT encourages them to "call a suicide hotline."

Following Raine's death in April, the $500 billion company stated it would implement "stronger safeguards around sensitive content and risky behavior" for users under 18. Parental controls will also be introduced to give parents "the ability to gain more insight into and influence how their teenagers use ChatGPT."

Every week, 15,000 people commit suicide. About 10% of the world communicates with ChatGPT. That's about 1,500 people a week who, assuming this is correct, communicate with ChatGPT and still commit suicide in the end. They probably talked about it. We probably didn't save their lives. Maybe we could have said something better. Maybe we could have been more proactive. Maybe we could have given a little better advice like, "Hey, you need this help, or you need to think about this problem differently, or it's really worth continuing, and we'll help you find someone you can talk to."

– Altman told the podcaster.

The suicide figures appear to be a global estimate: the World Health Organization states that more than 720,000 people die by suicide each year, or roughly 13,800 a week, broadly in line with the figure Altman cited.

Altman also said that ChatGPT would stop some vulnerable users from manipulating the system into giving suicide advice by pretending to be asking for information for a fictional story they are writing or for medical research.

Sam Altman said it would be reasonable to "deprive them of some freedom" in the case of "underage users and perhaps users who are generally in vulnerable mental states."

We have to say: even if you're trying to write a story or even if you're trying to do medical research, we're just not going to answer.

– Altman noted.

Addition

OpenAI has announced new safety tools in ChatGPT designed to help protect teenagers and users experiencing emotional distress.