For many people, the digital era is a lonely one. In a fast-moving, always-on, and connected world, it can be harder than ever to maintain human relationships.
As artificial intelligence development advances, many people have begun turning to virtual companions and AI partners to cope with the loneliness epidemic.
According to research by Andreessen Horowitz, highlighted in an X post by Debarghya Das (@deedydas), an investor at Menlo Ventures, eight of the top 50 new generative AI consumer web products are companionship or AI relationship apps.
These include Character, Janitor, CrushOn, Yodayo, Candy, SpicyChat, Chub, and DreamGF. These solutions range from the typical conversational companion to highly sexualized virtual characters offering pornographic roleplay.
Much of the interest in AI companions centers on AI girlfriends or boyfriends. As of May 2024, WordStream’s free keyword tool shows that the term “AI girlfriend” alone draws 135,000 searches per month in the US.
Techopedia explores the rise in virtual companionship.
Key Takeaways
- AI companion apps are surging in use, driven by a loneliness epidemic.
- 8 out of the top 50 new generative AI consumer web products are companionship or AI relationship apps, with significant search volumes for terms like “AI girlfriend”.
- Despite increased connectivity through technology, loneliness is a growing issue, with almost 1 in 4 people (24%) across 142 countries reporting that they feel very or fairly lonely.
- While AI virtual friend apps can provide comfort and a stepping stone to further interaction, there are concerns that they might increase dependency and reduce real human connections.
- Vendors should clearly communicate that AI chatbots, lacking true emotions and reasoning, are not substitutes for genuine human interaction and emotional support.
The Loneliness Epidemic
While we may be more connected through technology than ever before, many people are also lonelier than ever. A Meta-Gallup survey finds that almost 1 in 4 people (24%) across 142 countries report feeling very or fairly lonely.
This may explain why AI companions appeal as a source of interaction, a dynamic often explored in fiction.
We need look no further than 2017’s Blade Runner 2049 and 2013’s Her, films in which the protagonist falls in love with a virtual character.
In written commentary to Techopedia, Waseem Mirza, BBC technology presenter and host of TheFutureTECHShow, agreed that loneliness is a prime motivator for turning to AI companions.
“There’s no denying that loneliness is a growing issue in our society. Busy lifestyles, social isolation, and even geographical factors can all contribute.
“AI companionship apps offer a potential solution by providing a readily available source of conversation, support, and even entertainment. They should be a valuable tool for those struggling with loneliness.”
That said, Mirza warns that these solutions create some challenges that need to be considered.
“While they can be helpful, they shouldn’t be seen as a cure-all. Human connection is irreplaceable, and there’s no substitute for the richness of real-world relationships.
“However, for those struggling with social isolation, AI companions can provide a stepping stone to further interaction or a source of comfort during difficult times,” he added.
Are AI Companions Healthy?
One of the biggest questions raised by the growing use of AI companions and AI girlfriends is whether they are good for the user.
Do these solutions give lonely people a way to feel less alone during tough times, or do they make the problem worse by driving them further from human connection and real emotional support?
“Even though they help lonely people, they can also make people dependent, less likely to connect with real people,” Rajesh Namase, co-founder and professional tech blogger at TechRT, told Techopedia via written comments.
“Finding the right balance between AI companionship and real human bonds is very hard. To lessen the bad effects that could happen to society, strict rules and laws are needed.”
In any case, vendors need to make clear to users that virtual assistants based on large language models (LLMs) are incapable of thinking and feeling emotions in the way that human beings do.
As Yann LeCun, Meta’s chief AI scientist, outlined in a recent interview with the Financial Times, LLMs lack logical understanding, have no persistent memory, and cannot reason.
This means that a user cannot have a real relationship with a chatbot in its current state; at the end of the day, it is an inanimate object.
By clarifying this to users, AI vendors can reduce the chance of people, particularly vulnerable people, seeking genuine emotional support from a chatbot that doesn’t think or feel.
Likewise, communicating with a chatbot cannot be considered a substitute for getting out, touching grass, and connecting with friends and family to build real, long-lasting relationships.
The Bottom Line
AI companions and AI partners are here to stay. And if someone finds value in a virtual companion in their life, who are we to say otherwise? After all, people have a right to make their own decisions.
However, that doesn’t mean we can’t call out these apps for being a poor substitute for real human connection and put pressure on LLM vendors to make clear that their chatbots are incapable of autonomous thought and empathy.