
OpenAI reports over one million users discuss suicide with ChatGPT


OpenAI reveals that more than a million people have discussed suicidal thoughts with ChatGPT, prompting new safety measures and expert collaboration.

Many people turn to technology when they feel isolated — but what happens when that technology becomes the first place they go for help?

New data from OpenAI reveals just how many people reach out to ChatGPT in moments of mental health crisis, sparking an urgent conversation about the role of AI in emotional support.

AI at the edge of human emotion

According to OpenAI, around 0.15% of ChatGPT users have conversations showing clear signs of suicidal thoughts or intent — the equivalent of over one million people.

Another 0.07% may be experiencing acute mental health crises such as manic or psychotic episodes.


With an estimated 800 million weekly users, that 0.07% corresponds to roughly 560,000 people possibly facing severe psychological distress while chatting with an AI assistant.

Strengthened safeguards after tragedy

The findings follow a lawsuit filed by the parents of a Californian teenager who reportedly received harmful advice about self-harm from the chatbot.

Since then, OpenAI has implemented stricter safety measures — including parental controls, enhanced detection of mental health emergencies, and automatic redirection to more secure versions of the model when sensitive topics arise.

The company has also integrated referrals to crisis helplines and gentle prompts encouraging users to take breaks during prolonged or emotional conversations.


A growing role for mental health professionals

OpenAI says it now collaborates with more than 170 mental health professionals to refine how the system recognizes and responds to users in distress.

The aim is to ensure that AI tools can identify emergencies early and guide people toward real, human help.

While ChatGPT is not designed to replace professional care, these updates show a shift toward responsible intervention — balancing innovation with empathy.

This article is based on information from Ziare.com.

