OpenAI reports over one million users discuss suicide with ChatGPT

OpenAI reveals that more than a million people have discussed suicidal thoughts with ChatGPT, prompting new safety measures and expert collaboration.

Many people turn to technology when they feel isolated — but what happens when that technology becomes the first place they go for help?

New data from OpenAI reveals just how many people reach out to ChatGPT in moments of mental health crisis, sparking an urgent conversation about the role of AI in emotional support.

AI at the edge of human emotion

According to OpenAI, around 0.15% of ChatGPT users have conversations showing clear signs of suicidal thoughts or intent — the equivalent of over one million people.

Another 0.07% may be experiencing acute mental health crises such as manic or psychotic episodes.

With an estimated 800 million weekly users, that’s roughly 560,000 people potentially facing severe psychological distress while chatting with an AI assistant.
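
The percentages translate into absolute counts with simple arithmetic; as a minimal back-of-the-envelope check (taking the 800 million weekly-user figure, which is OpenAI's own estimate):

```python
# Back-of-the-envelope check of the figures reported by OpenAI.
WEEKLY_USERS = 800_000_000  # OpenAI's estimated weekly active users

# 0.15% of users show clear signs of suicidal thoughts or intent.
suicidal_intent = round(WEEKLY_USERS * 0.0015)
# 0.07% may be experiencing acute crises such as manic or psychotic episodes.
acute_crisis = round(WEEKLY_USERS * 0.0007)

print(f"Signs of suicidal intent: {suicidal_intent:,}")   # 1,200,000
print(f"Acute mental health crisis: {acute_crisis:,}")    # 560,000
```

Both headline figures ("over one million" and a mid-six-figure count) follow directly from the reported percentages.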

Strengthened safeguards after tragedy

The findings follow a lawsuit filed by the parents of a Californian teenager who reportedly received harmful advice about self-harm from the chatbot.

Since then, OpenAI has implemented stricter safety measures — including parental controls, enhanced detection of mental health emergencies, and automatic redirection to more secure versions of the model when sensitive topics arise.

The company has also integrated referrals to crisis helplines and gentle prompts encouraging users to take breaks during prolonged or emotional conversations.

A growing role for mental health professionals

OpenAI says it now collaborates with more than 170 mental health professionals to refine how the system recognizes and responds to users in distress.

The aim is to ensure that AI tools can identify emergencies early and guide people toward real, human help.

While ChatGPT is not designed to replace professional care, these updates show a shift toward responsible intervention — balancing innovation with empathy.

The article is based on information from Ziare.com.
