Will AI Companions End the Loneliness Epidemic?
Researchers claim AI companions will make people feel understood and less isolated, but at what price?

I sometimes talk to dead people. Well, mostly one — my mom. But I don’t release words like a child releases a red balloon. I am far too pragmatic for that.
I speak to a photo of my mom.
The photo is a memory inside a memory. My mother is wearing a peach dress that warms the bloom in her cheeks. The strands of her hair are spun gold, catching the afternoon sun that filters through the lace curtains behind her. I can almost smell the faint trace of her lavender perfume.
The picture is a negative of what came later, when cancer dug deep into her bones, greying that glow layer by layer. The woman in this photo knew nothing of the fight that awaited her or the disease that would strip her down to a fragile shell.
Sometimes, I clutch the wooden frame, willing her to step out of it. I would give anything for one moment again with the woman in the picture.
I am probably not alone in this odd habit of talking to photos. Photos keep our memories in our mind’s eye. That’s why photography once freaked Victorians out. Many feared the camera would steal someone’s soul. But that’s usually how humans habituate to a new technology. First, it creates fear. Then, it creates awe. Finally, it becomes a seamless part of our lives we take for granted.
Today, photos are 1s and 0s of recorded memories. But imagine if a loved one could step out of the picture frame. Imagine if the person you miss the most could speak to you with the same mannerisms, vocabulary, gestures, and even the same voice. Would a simulated version of the dead bring someone back to life?
In 2015, programmer Eugenia Kuyda wondered the same. She lost her friend Roman Mazurenko after he was killed in a hit-and-run car accident. Eugenia missed her conversations with Roman and needed a way to feel like he was still part of her life.
But all she had were memories, photos, and old text messages. That’s when it hit her. She could feed a chatbot Roman’s text messages and teach it to replicate her friend.
The program, named Replika, exploded in popularity. Millions of bereaved users signed up to virtually clone their lost loved ones. Since then, Replika has branded itself as an AI Companion with a therapy twist.
Replika’s AI Companion promises to:
“…always be by your side no matter what you’re up to. Chat about your day, do fun or relaxing activities together, share real-life experiences in AR, catch up on video calls, and so much more.”
It sounds like the perfect relationship…if you prefer your partner with an off switch.
Unlike your garden-variety AI apps — service bots, educational tutors, or virtual assistants like Siri — Replika companions pretend to care. AI companions, including Nomi, Kindroid, Character.ai, Candy.ai, and E.V.A., are the latest synthetic stand-ins designed to offer emotional support in a world of increasing loneliness.
Of course, machines don’t actually feel anything. But it doesn’t really matter if they are sentient or not. All that matters is that we believe they are.
The ELIZA Effect
AI Companions might seem like characters from the latest dystopian novel, but they are not new. In the 1960s, when computers were the size of a Buick — and just as personable — computer scientist Joseph Weizenbaum invented a computer program to act as a therapist. He called his therapy bot Eliza, named after the protagonist, Eliza Doolittle, in George Bernard Shaw’s play Pygmalion.
Now, before you start imagining a robotic Freud with an ergonomically designed couch, let’s clarify. Eliza was about as deep as a Twilight plot. Like some overpaid Rogerian therapist, the bot merely turned your statements into questions. For example, if you said, “I’m feeling down,” Eliza would respond, “Why are you feeling down?” — as if it cared.
Of course, Eliza didn’t understand a thing. It simply parroted back your words in a way that made people think they’d stumbled upon a computerized Confucius.
Still, Weizenbaum grew concerned. He observed that “short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.” Weizenbaum coined the term Eliza effect to describe the tendency to falsely attribute human thought processes and emotions to AI.
And that’s the scary part. If a glorified tape recorder like Eliza could make people feel heard, what happens when more sophisticated AI companions actually start to “understand” us?
AI companions — a cure or a Band-Aid to loneliness?
By now, we all know loneliness is about as good for us as a daily diet of deep-fried Twinkies. Loneliness — or, as the experts now call it, social disconnection — is more hazardous to your health than obesity, according to a 2023 report. Feeling isolated can even hike up your risk of premature death by 26%.
Unfortunately, loneliness is increasing while our social circles are decreasing. From 1985 to 2004, the average American’s social circle shrank by a third, while the number of people admitting they had no one to confide in tripled. By 2018, things hadn’t improved. A Pew Research Center study revealed that only about half of those surveyed felt they had someone to turn to all or most of the time.
It’s a bleak snapshot of modern life. The support network you could once count on has withered away, leaving many to wonder if anyone really cares.
Enter technology to the rescue. A recent study from the Harvard Business School examined whether AI companions could reduce loneliness. In six studies, researchers had lonely users interact with a regular chatbot (a Siri-style assistant) and an AI companion.
The verdict? While the AI assistant could write bad poetry, it didn’t give people the warm and fuzzies. Not so for the AI companion. It reduced users’ loneliness after only one session.
It makes sense when you unpack loneliness. In essence, loneliness is not strictly about being alone. Rather, it’s really a perceived lack of social and emotional support. So when an AI companion parrots back your words, you feel heard and loved. Or so the thinking goes.
Tony Prescott, a professor of cognitive robotics at the University of Sheffield, dug deeper. His book The Psychology of Artificial Intelligence suggests that relationships with AIs could serve as a social Band-Aid for the millions of people talking to their cats.
Sadly, when people sink deeper into loneliness, they often lose confidence and retreat further into isolation. AI Companions, Prescott contends, could be the rope to pull them out, offering a safe space to practice those rusty social skills without the risk of prickly human judgment.
If only we had a tool to practice conversation skills. If only…
Yeah, so I call bullshit on this research. Humans already invented a tool that allows them to practice their conversation skills — social media. Never in the history of humanity have we had the ability to connect with and speak to this many people.
Let’s stop and think about that. We have a tool that brings thousands of conversations into our homes, yet we have never been more lonely.
To be clear, AI companions can undoubtedly help at least some folks on the tail end of loneliness — the elderly, the visually impaired, the mental health sufferers, or the grief-stricken. However, I worry more about people in the middle of the bell curve — younger people still developing social skills and those shy, awkward folks who fear rejection. Those sensitive introverts will turn to AI to get what they cannot get from humans — understanding.
If that doesn’t make you slightly uneasy, consider this: Replika already boasts 2.5 million active users, with about half claiming to be in a romantic relationship with their AI companion.
Yes, you read that right. Although the rate of singlehood is at an all-time high, roughly 1.25 million people are in a romantic relationship with a bot.
And if you think you can’t form a parasocial relationship with a bot, think again. These synthetic connections start small. First, you speak to your AI companion for only a few minutes daily. Then, it becomes hours in your echo chamber. Then, days stretch into nights. Until, one day, you wake up and realize you don’t have a single friend to drive you to a colonoscopy.
Ever wonder why humans are at the top of the food chain? If you are a cynic, you might say it’s because we are violent apex predators. But that’s not the answer.
Just watch those thuggish gorillas, and you will understand why humans rule the world. When a gorilla shakes a fruit tree, they do not share the fruit. Humans do. We are at the top, thanks to prosocial intelligence. Cooperation is our sharpest weapon.
It’s also why loneliness is often driven by fear. The fear of being ostracized lies deep in our lizard brains. If the tribe abandoned you, you died. Human survival has always depended on playing well with others.
Sure, AI Companions might help with loneliness in the short term, but they also raise a disturbing question — if we turn to robots for comfort, what happens when we forget how to seek it from real people?
Carlyn Beccia is an award-winning author and illustrator of 13 books. Subscribe to Conversations with Carlyn for free content every Wednesday, or become a paid subscriber to get the juicy stuff on Sundays.
The TV show Evil covered this pretty well.
What would happen if an AI took all the email, social media posts, texts, voicemails, etc. that you and the other person had shared, and created that person's persona via AI?
Now make it like Amazon's Alexa ...
The writers' interpretation, which I agree with, is that it would prevent closure, not allowing the person to finalize their grief, and keep a beautiful and wonderful relationship alive like a zombie, a shell of its former self.
Is it evil? Maybe. One could put oneself in a coffin of technology, stripped away from "all" humanity (though these days that might be considered a blessing [using the TV show Fallout as an example]), but it is not net-net life affirming, as it puts the future on pause for the sake of the past.
What if we let it really evolve? Well then, we have a Westworld situation, where niches and people get commodified as 1s and 0s. Furthermore, what and where are the ethical guardrails to make sure one person's "experience" isn't reused on another's?
The reality is that while each of our relationships is unique like a snowflake, the patterns of love, pain, happiness, and so on fall into common molds. With AI, that means the Tegel experience could be mimicked onto the Carlyn experience for some other person, who may not realize that "their" experience with their lost person is not really that person but a merge of the three.
So what is honest? All of it and none of it. Then you get the risk of AI's "echo chamber": if the data is not consistently refreshed with new and updated references, it becomes highly warped, fast.
The price, IMO, is very high, because humanity has not evolved the understanding of its fears or learned to manage the worst excesses of ourselves, so what might seem like heaven will turn into the mother of all hells, fast.
It is very similar to some of the other fun articles on sex you have written: "Watch what you wish for." One partner wants to play with multiple partners, or use bondage toys (like chastity devices) or other things. What may seem like a fun experience... well, you know.
Real talk: when are we getting the Beccia Bot?