
OpenAI reports over one million users discuss suicide with ChatGPT


OpenAI reveals that more than a million people have discussed suicidal thoughts with ChatGPT, prompting new safety measures and expert collaboration.

Many people turn to technology when they feel isolated — but what happens when that technology becomes the first place they go for help?

New data from OpenAI reveals just how many people reach out to ChatGPT in moments of mental health crisis, sparking an urgent conversation about the role of AI in emotional support.

AI at the edge of human emotion

According to OpenAI, around 0.15% of ChatGPT users have conversations showing clear signs of suicidal thoughts or intent — the equivalent of over one million people.

Another 0.07% may be experiencing acute mental health crises such as manic or psychotic episodes.


With an estimated 800 million weekly users, that 0.07% amounts to roughly 560,000 people possibly facing severe psychological distress while chatting with an AI assistant.

Strengthened safeguards after tragedy

The findings follow a lawsuit filed by the parents of a Californian teenager who reportedly received harmful advice about self-harm from the chatbot.

Since then, OpenAI has implemented stricter safety measures — including parental controls, enhanced detection of mental health emergencies, and automatic redirection to more secure versions of the model when sensitive topics arise.

The company has also integrated referrals to crisis helplines and gentle prompts encouraging users to take breaks during prolonged or emotional conversations.


A growing role for mental health professionals

OpenAI says it now collaborates with more than 170 mental health professionals to refine how the system recognizes and responds to users in distress.

The aim is to ensure that AI tools can identify emergencies early and guide people toward real, human help.

While ChatGPT is not designed to replace professional care, these updates show a shift toward responsible intervention — balancing innovation with empathy.

This article is based on information from Ziare.com.

