Tuesday, October 28, 2025

OpenAI says over a million people talk to ChatGPT about suicide weekly; its responses are now shaped with help from more than 170 mental health experts

Cause for concern? Is this mostly curiosity or exploration, or is it something more serious?

Are these conversations helpful to the users?

How many users are addicted to or infatuated with ChatGPT?

"OpenAI released new data on Monday illustrating how many of ChatGPT’s users are struggling with mental health issues and talking to the AI chatbot about it. The company says that 0.15% of ChatGPT’s active users in a given week have “conversations that include explicit indicators of potential suicidal planning or intent.” Given that ChatGPT has more than 800 million weekly active users, that translates to more than a million people a week. ...

... and that hundreds of thousands of people show signs of psychosis or mania in their weekly conversations with the AI chatbot. ..."

"... Working with mental health experts who have real-world clinical experience, we’ve taught the model to better recognize distress, de-escalate conversations, and guide people toward professional care when appropriate.

We’ve also expanded access to crisis hotlines, re-routed sensitive conversations originating from other models to safer models, and added gentle reminders to take breaks during long sessions. ..."

OpenAI says over a million people talk to ChatGPT about suicide weekly | TechCrunch

Strengthening ChatGPT’s responses in sensitive conversations (original news release) "We worked with more than 170 mental health experts to help ChatGPT more reliably recognize signs of distress, respond with care, and guide people toward real-world support – reducing responses that fall short of our desired behavior by 65-80%."
