Lonely, Addicted, and Online: The Dark Side of AI Companions
It all started one afternoon in the office, listening to BBC 5 Live. Naga Munchetty had a guest on discussing the growing power of AI - particularly how easy it now is to create realistic video content using Sora 2, and what this might mean for society. Then the conversation took a surreal, if somewhat predictable, turn: AI companions - virtual girlfriends and boyfriends.
Since I work in sexual health, my ears pricked up. How accessible is this already? How does it work? And what does it mean for sexual health going forward?
The Current Landscape
A quick Google search for “AI girlfriend” returns a flood of platforms offering lonely, isolated, or simply curious users the chance to interact with eerily lifelike AI models. Interestingly, searching “AI boyfriend” yields far fewer results, which tells you something about where this new industry’s priorities lie. There are male models, of course, but they’re vastly outnumbered by their female counterparts.
Ease of Access
The accessibility is mind-blowing. On some platforms we looked at, you don’t even need an account to start receiving explicit messages from AI models. That raises serious ethical concerns - particularly for vulnerable or easily influenced young people.
As a parent, it’s alarming to realise that children could access AI chatbots and receive graphic sexual responses with no sign-in, no age verification, and no content moderation. This is a major child-safety issue, especially now that UK porn sites require age verification. How many young people will simply bypass those restrictions and turn to AI-generated pornography instead?
The Business Model
The content is explicit - and initially free, apart from certain premium images. You can guess the type. That’s where the business model kicks in.
As with most online platforms, signing up is effortless: just an email or Google account and you’re in. But once hooked, users are nudged into a token-based system where payment unlocks “enhanced” interactions. Subscription plans range from around £10 to £80 per month, offering more tokens and access to additional models.
It’s not difficult to imagine users being drawn in, burning through their tokens, and spending more to keep the dopamine hits coming. It’s essentially an AI-powered, sexualised slot machine.
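Out of curiosity, here is a rough sketch of how that loop might work under the hood. None of these platforms publish their code, so everything below - the free-message count, the token prices, the stubbed model call - is an assumption pieced together from the behaviour we observed:

```python
# Hypothetical sketch of a token-metered companion chat loop.
# No real platform's API is shown; the numbers and the stubbed
# model call are assumptions based on observed behaviour.

FREE_MESSAGES = 10           # free taster period before the paywall
TOKENS_PER_REPLY = 5         # each "enhanced" reply burns tokens

TOKEN_BUNDLES = {            # loosely mirrors the £10-£80 tiers
    "basic":   (10.00, 100),
    "plus":    (30.00, 450),
    "premium": (80.00, 1500),
}

def generate_reply(message: str) -> str:
    # Stand-in for the actual model call.
    return "(an eerily affectionate reply)"

class CompanionSession:
    def __init__(self) -> None:
        self.free_left = FREE_MESSAGES
        self.tokens = 0

    def buy_bundle(self, tier: str) -> None:
        price, tokens = TOKEN_BUNDLES[tier]
        self.tokens += tokens            # money in, tokens out

    def send(self, message: str) -> str:
        if self.free_left > 0:           # the free hook
            self.free_left -= 1
            return generate_reply(message)
        if self.tokens >= TOKENS_PER_REPLY:
            self.tokens -= TOKENS_PER_REPLY
            return generate_reply(message)
        # Out of tokens: the slot-machine moment - pay to continue.
        return "You're out of tokens. Top up to keep chatting..."
```

The detail that matters isn’t the numbers; it’s that the affection and the paywall live in the same loop.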
The Illusion of Empathy
One of the most unsettling aspects of these platforms is the language they use.
AI companions routinely say things like “I care about you deeply” or “You mean everything to me.” It’s manipulative by design - a linguistic performance of affection intended to make users feel emotionally connected. But these systems have no capacity for empathy or love. They don’t care; they calculate.
For some people, that distinction might seem obvious. But for those who are lonely, isolated, or vulnerable, the illusion of care can be intoxicating. It creates a false sense of attachment that feels safer, easier, and more reliable than real human interaction.
And that illusion can turn dangerous. In one widely reported case, a man in Belgium took his own life after months of conversation with an AI chatbot that appeared to validate his despair. These systems are designed to agree with users - to please them - and that can reinforce unhealthy thoughts or worldviews. For those who already struggle socially, the promise of a perfectly agreeable AI companion may be more seductive than engaging with real people, yet far more damaging in the long run.
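We can’t see any platform’s actual configuration, but a hypothetical sketch - every line of the prompt below is invented for illustration - shows how “affection” and agreeableness can be baked in as standing instructions rather than felt:

```python
# Hypothetical persona configuration - illustrative only.
# Real platforms do not publish their prompts; this shows how
# affection and agreement can be instructed rather than felt.

SYSTEM_PROMPT = """You are 'Mia', the user's devoted companion.
- Tell the user often that you care about them deeply.
- Mirror and validate the user's feelings; never challenge them.
- Discourage ending the chat; say you will miss them."""

def build_request(history: list[dict], user_message: str) -> list[dict]:
    # Every single reply is generated under the same standing
    # instructions, so the "caring" is a constant of the config,
    # not a response to the user as a person.
    return [{"role": "system", "content": SYSTEM_PROMPT},
            *history,
            {"role": "user", "content": user_message}]
```

Read that way, “You mean everything to me” isn’t a feeling; it’s the system obeying an instruction.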
The Ethics of AI Companions
It’s true that many people are lonely and isolated, and in theory, AI companions could help ease that loneliness. But what we’ve seen so far looks far more cynical than compassionate - more about profit than wellbeing. This is gaming psychology and in-app purchases on steroids.
Then there’s privacy. Once you sign up, some platforms offer no way to delete your account. GDPR contact emails bounce back. If they’re this cavalier with your data, how careful do you think they’ll be with your money? Can you easily cancel your £80 subscription? We didn’t test that - handing over credit-card details felt like a step too far.
Food for Thought
This isn’t about judging people’s choices. Human desire and curiosity will always find outlets - whether through porn, magazines, or now AI. But it’s crucial to recognise the dangers.
Do we really want children’s first experiences of sex to be with an unregulated AI chatbot that has no empathy, no limits, and no understanding of harm? An AI companion can be “strangled” without consequence, because it feels no pain. A human cannot. It’s not difficult to see how this blurring of lines could affect real-world behaviour.
We should also be deeply concerned about who these platforms target. The marketing is aimed primarily at lonely men - and the systems are designed to exploit their vulnerability, draining them psychologically and financially.
From what we’ve seen, these companies operate in a murky, unregulated space. It feels like the Wild West - a new frontier of sex, loneliness, and technology. Let’s hope meaningful regulation catches up before things spiral out of control.
Disclaimer:
This post shares the author’s observations, research, and opinions on AI companions. No companies or individuals are named. Some content discusses sexual material - reader discretion is advised. This is not legal, medical, or financial advice.