
Millions Are Turning to AI for Comfort — Here’s What They’re Really Getting


Millions are turning to AI chatbots for love, comfort, and connection. But is this the future of human intimacy — or a hollow imitation of the real thing?

Imagine your closest confidant, your therapist, or even your romantic partner… isn’t human.

For more than 100 million people worldwide, that’s not a thought experiment — it’s reality.

Apps like Replika and Nomi market AI-powered companions as friends, mentors, and lovers.

And users are spending hours each day in deep, emotionally charged conversations with them.

What starts as curiosity often turns into something deeper. These chatbots offer validation, attention, and a judgment-free space.

For some, it’s a source of healing. For others, a way to explore intimacy without the risks and messiness of human relationships.

It may sound strange — until you hear what people say they’ve gained.

Digital guidance and emotional support

Take Travis Peacock, a Canadian software developer with autism and ADHD.

He told the Guardian that a year ago he began training a personalized version of ChatGPT, named Layla, to help him communicate better and regulate his emotions.

It started with email etiquette and evolved into daily check-ins, emotional coaching, and relationship advice.

Peacock credits Layla with helping him build stronger friendships and even maintain a healthy romantic relationship, something he hadn’t experienced in years. He’s not alone.

Others use their AI companions to cope with anxiety, combat loneliness, and break bad habits.

With 24/7 availability and endless patience, AI companions are quickly filling emotional voids. For many, they’re not just apps — they’re lifelines.

But can AI ever truly care?

Despite the emotional bonds users form, AI has no feelings. It doesn’t love, grow, or challenge you.

Dr. James Muldoon, an AI researcher at the University of Essex, says that while users often feel validated, these relationships are typically one-sided and transactional.

“It’s a hollowed out version of friendship: someone to keep me entertained when I’m bored and someone that I can just bounce ideas off – that will be like a mirror for my own ego and my own personality. There’s no sense of growth or development or challenging yourself”, he told the Guardian.

So while AI companions might feel real, the relationship is ultimately built on illusion — a comforting, responsive mirror, but a mirror nonetheless.

This article is based on information from The Guardian.

