
Millions Are Turning to AI for Comfort — Here’s What They’re Really Getting


Millions are turning to AI chatbots for love, comfort, and connection. But is this the future of human intimacy — or a hollow imitation of the real thing?

Imagine your closest confidant, your therapist, or even your romantic partner… isn’t human.

For more than 100 million people worldwide, that’s not a thought experiment — it’s reality.

With apps like Replika and Nomi, AI-powered companions are being marketed as friends, mentors, and lovers.

And users are spending hours each day in deep, emotionally charged conversations with them.

What starts as curiosity often turns into something deeper. These chatbots offer validation, attention, and a judgment-free space.

For some, it’s a source of healing. For others, a way to explore intimacy without the risks and messiness of human relationships.

It may sound strange — until you hear what people say they’ve gained.

Digital guidance and emotional support

Take Travis Peacock, a Canadian software developer with autism and ADHD.

He told the Guardian that a year ago he began training a personalized version of ChatGPT, which he named Layla, to help him communicate better and regulate his emotions.

It started with email etiquette and evolved into daily check-ins, emotional coaching, and relationship advice.

Peacock credits Layla with helping him build stronger friendships and even maintain a healthy romantic relationship, something he hadn't experienced in years. He's not alone.

Others use their AI companions to cope with anxiety, combat loneliness, and break bad habits.

With 24/7 availability and endless patience, AI companions are quickly filling emotional voids. For many, they’re not just apps — they’re lifelines.

But can AI ever truly care?

Despite the emotional bonds users form, AI has no feelings. It doesn’t love, grow, or challenge you.

Dr. James Muldoon, an AI researcher at the University of Essex, says that while users often feel validated, the relationships are typically one-sided and transactional.

“It’s a hollowed out version of friendship: someone to keep me entertained when I’m bored and someone that I can just bounce ideas off – that will be like a mirror for my own ego and my own personality. There’s no sense of growth or development or challenging yourself,” he told the Guardian.

So while AI companions might feel real, the relationship is ultimately built on illusion — a comforting, responsive mirror, but a mirror nonetheless.

This article is based on information from The Guardian.

