
OpenAI reports over one million users discuss suicide with ChatGPT


OpenAI reveals that more than a million people have discussed suicidal thoughts with ChatGPT, prompting new safety measures and expert collaboration.

Many people turn to technology when they feel isolated — but what happens when that technology becomes the first place they go for help?

New data from OpenAI reveals just how many people reach out to ChatGPT in moments of mental health crisis, sparking an urgent conversation about the role of AI in emotional support.

AI at the edge of human emotion

According to OpenAI, around 0.15% of ChatGPT users have conversations showing clear signs of suicidal thoughts or intent — the equivalent of over one million people.

Another 0.07% may be experiencing acute mental health crises such as manic or psychotic episodes.


With an estimated 800 million weekly users, that’s roughly 560,000 people possibly facing severe psychological distress while chatting with an AI assistant.

Strengthened safeguards after tragedy

The findings follow a lawsuit filed by the parents of a Californian teenager who reportedly received harmful advice about self-harm from the chatbot.

Since then, OpenAI has implemented stricter safety measures — including parental controls, enhanced detection of mental health emergencies, and automatic redirection to more secure versions of the model when sensitive topics arise.

The company has also integrated referrals to crisis helplines and gentle prompts encouraging users to take breaks during prolonged or emotional conversations.


A growing role for mental health professionals

OpenAI says it now collaborates with more than 170 mental health professionals to refine how the system recognizes and responds to users in distress.

The aim is to ensure that AI tools can identify emergencies early and guide people toward real, human help.

While ChatGPT is not designed to replace professional care, these updates show a shift toward responsible intervention — balancing innovation with empathy.

This article is based on information from Ziare.com.


Other articles

Study examines how artificial sweeteners affect weight and gut health

A new long-term study finds that artificial sweeteners have only modest effects on weight and gut health.

Scientists explore how vitamin D may shape long-term health

New research hints that vitamin D could influence far more of your long-term health than previously believed.

An overview of early-stage pancreatic cancer symptoms

A number of subtle symptoms may signal pancreatic cancer long before the disease is diagnosed.

What researchers found may finally help diabetics recover more naturally

A new scientific insight suggests diabetes damage might be easier to slow than anyone expected.
