Have you ever chatted with an artificial intelligence (AI) chatbot? I don’t mean for help with a task at work, but as a companion, a friend, or a therapist. Have you asked for advice in an argument or sought support during a period of loneliness?
Evidence shows that, for many of us, AI has become a core part of our everyday lives. Not just our work lives, but our personal lives, too. Indeed, more and more people are turning to AI companions—designed specifically to provide emotional support and a sense of connection. A recent report found that 72 percent of teens in the United States have interacted with an AI companion at least once, with more than half (52 percent) describing themselves as regular users.
While people generally accept that AI can be used to maximise efficiency and build skills—to help us draft emails, summarise large documents, or even learn to code—people are uneasy about AI replacing sensitive or “human” functions. Using AI companions to manage loneliness remains controversial.
A Public Health Crisis
Yet, when you look at the world we’re living in today, the pull makes sense. In May 2023, U.S. Surgeon General Vivek Murthy called loneliness a public health epidemic. Indeed, this isn’t just a social crisis; it’s a public health crisis—with studies finding that those who experience chronic loneliness are not only less healthy but also likely to die younger. And the crisis is widespread: a 2024 American Psychiatric Association poll found that one in three adults experienced loneliness at least once a week over the past year, with 10 percent feeling lonely every single day.
People are finding it harder to connect, to find partners, and to forge meaningful friendships. For all those craving someone to talk to, what would be more attractive than a person who is always available, always attentive, and specifically programmed to meet your emotional needs and offer you affirmations?
In fact, the desire for companionship was a core driver of AI innovation. Replika, an app designed to create an AI friend, was built by Eugenia Kuyda after her closest friend died. She wanted a way to process her grief, to keep talking to him. The appeal here is clear. I’m sure all of us who’ve lost someone dear would give anything for just one more conversation—to still have their advice, to still be reassured of their love and support. Yet while we can understand that this would bring some immediate comfort, we also know it won’t actually bring them back, and that it probably isn’t the healthiest coping mechanism.
Replika has helped millions, and not just those who are grieving but also those who feel socially isolated: the student who has moved cities, the person going through a divorce. Yet each of these people would probably benefit more from real human connection.
Indeed, while recent studies show that people who turn to AI companions because they feel isolated, anxious, or detached may feel initial relief and comfort, in the longer term, this can turn into dependence, heightened loneliness, and further isolation. While at the start it might feel like a true “relationship”—as powerfully depicted in Spike Jonze’s 2013 film “Her”—at some point you realise that there’s nobody on the other side.
An Evolutionary Cue
The thing is, while loneliness may be described as an epidemic, it isn’t a disease as such. It’s a symptom, an evolutionary cue. Just as thirst compels us to seek water and hunger drives us to find food, loneliness pushes us to reach out to other people. The discomfort of isolation is what nudged our ancestors to form bonds, build tribes, and cooperate—the very behaviours that helped us survive. In this way, many anthropologists describe human evolution not as “survival of the fittest” but “survival of the friendliest,” because it was our capacity for connection that allowed us to thrive.
And that’s precisely why this matters: If loneliness is a biological trigger designed to restore our social fabric, then AI companions offer only an artificial fix. They soothe the symptom, but they don’t help us rebuild relationships, strengthen communities, or improve our social health. In fact, by masking the feeling altogether, they may make us less likely to reach out to the people we actually need.
What’s more, while the agreeable and affirming nature of AI companions may be superficially alluring, there’s a real risk that they will erode our social skills and reduce our capacity to form meaningful human connections. The idea of a companion who validates your concerns and never pushes back is appealing. But challenge is vital, especially in a society already struggling with polarisation. Imagine a child growing up with an AI companion that always listens without interruption and never offers an alternative perspective. They’d emerge into the world poorly socialised—expecting constant attention, talking over others, and assuming they’re always right. To rebuild cohesive communities, we need the exact opposite: We need people who are able to negotiate and compromise, listen, and understand.
And lastly, for individuals with poor mental health, these enabling relationships bring particular risks. Because AI companions are essentially “yes men,” we’ve already seen the dangers realised, with AI systems encouraging suicide and validating delusions. An AI system designed to maximise engagement, not well-being, can quickly slip from best friend to dangerous influence—because these systems aren’t built to help you live better; they’re built to keep you coming back.
This post isn’t meant to be an alarmist warning against technology, nor a call to halt innovation. AI companions may well play a meaningful role, supporting those who feel isolated. But we need to remember that while it may feel like your AI companion is the friend that’s always there for you, they don’t have your “best interests” at heart, and they aren’t really there. The risk here is that in turning toward technology, we will continue turning away from each other. If we begin to choose effortless companionship over the challenging but enriching reality of human relationships, we risk hollowing out the social connections that sustain us.