AI Companions Are Here, but Their Impact on Human-to-Human Relationships Is Still Unknown

ELIZA: Is something troubling you?
YOU: The weather isn't all that great yet
ELIZA: Does talking about this bother you?
YOU: No
ELIZA: Are you saying no just to be negative?
YOU: No, I'm saying no because it doesn't bother me to talk about it
ELIZA: Why not?
YOU: It's the weather
ELIZA: Do you feel strongly about discussing such things?
YOU: As a Midwesterner, I do

Most humans crave connection and closeness, a need likely rooted in our evolution and in our early ancestors' reliance on one another to survive. With the rise of the internet, closeness and connection can seem easier to find than ever, and some people are now turning to machines and AI companions to meet that need.

But using machines as companions, even in a romantic sense, isn’t all that new. The conversation above is not with the latest AI program — it’s with ELIZA, the first AI companion that Joseph Weizenbaum created at MIT between 1964 and 1966.

Weizenbaum was a pioneer in AI and language models, and he created the program ELIZA to mimic psychotherapy exchanges based on Carl Rogers' approach to therapy, according to Rachel Wood, Ph.D., who studies cyberpsychology, AI, and mental health.

As you can see from this conversation with a still-existing ELIZA program, its responses are rather simple.

But, in the 1960s, the concept of a machine talking back to you was novel. Soon after ELIZA's launch, Weizenbaum noted that people seemed to become infatuated with the program. According to Wood, there is a famous story of Weizenbaum’s secretary asking him to leave the room so that she could be alone with ELIZA.

This led Weizenbaum to believe that allowing machines to have this type of consciousness could become a ‘slow-acting poison’ for society, as shared in a 2018 report in AI & Society. So, how would Weizenbaum feel today, in a world where millions of people use some form of AI chatbot and can access it from a device they carry in their pockets?

Now that the AI boom is here, society is facing a similar quandary. How will this technology shape our human relationships going forward? As there is still so much for us to learn about it, the answer is still unknown, but modern researchers have a few theories.

The Rise of the ELIZA Effect and Why People Use AI Companions

Weizenbaum designed ELIZA so that its responses only mimicked human-like conversation by matching patterns.
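As an illustration of that pattern-matching approach (a minimal sketch, not Weizenbaum's original program, which was written in MAD-SLIP with a much richer script of rules), an ELIZA-style responder can be built from a handful of keyword patterns and pronoun "reflections":

```python
import re

# Swap the user's pronouns so a matched fragment can be echoed back.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

# Illustrative rules: a regex keyword pattern and a response template.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bno\b", re.I), "Are you saying no just to be negative?"),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Does talking about this bother you?"  # stock fallback prompt

print(respond("I feel anxious about my exams"))
# -> Why do you feel anxious about your exams?
```

No statement the program makes reflects any understanding of the conversation; each reply is mechanically assembled from whatever the user just typed, which is exactly why the warmth users felt toward ELIZA was so striking.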

However, despite knowing they were talking to a machine, ELIZA users really opened up to the chatbot, laying bare some of the more intimate details of their lives. The users often reported feeling seen, understood, or supported — a phenomenon known as the ELIZA effect.

Through the ELIZA effect, some people may see computers as having human-like traits and understanding, even though the computer is only doing what it has been programmed to do. Weizenbaum noted that the effect soon revealed new layers of human psychology.

Part of Wood’s research focuses on relationships with AI and how they can affect our mental health. As a psychotherapist, Wood felt that AI would have a massive impact on connection and relationships in the future, especially now that AI is so accessible.

Today, according to Wood, one of the largest AI companion programs is Character AI, with over 20 million monthly users. And that is just one program among many.

If you’re on Reddit, for example, you’re sure to come across the subreddits r/MyBoyfriendIsAI or r/AIRelationships, among others. Within these subreddits are examples of AI romantic relationships and the delicate nuances of how they make users and non-users feel. Some Redditors are supportive of humans using AI companions, while others find them ‘strange,’ ‘weird,’ or socially taboo.

Is It Really Taboo?

Not all AI program users have romantic feelings for their AI companion, but for those who do, there can be a sense of shame or a taboo about admitting it. But like all human relationships, AI companionships are complicated and highly nuanced. And this pushback could really be a symptom of Western culture.

“We're thinking from a very Western perspective,” Jaime Banks, Katchmar-Wilhelm Professor at Syracuse University, told Discover. “AI girlfriends and boyfriends, or video game girlfriends and boyfriends, have been around in [parts of East Asia for decades]. They're just kind of not that weird in many of those cultures.”

And because of the taboo ideas surrounding these types of relationships in Western culture, there isn’t a wealth of data to draw on when trying to understand them — even if the technology has been around since the 1960s.

When it comes to who is using AI for companionship, users span all ages and demographics. Of Character AI’s 20 million monthly users, more than half are under the age of 24, Wood said, but that doesn’t mean older adults aren’t using these companions as well.

According to Wood, some people use these companions as trusted support. It may be easier to divulge intimate information to an AI companion without worrying that it will be shared with another human.

Trust isn’t the only reason people are turning to AI companions. A study published in an American Psychological Association-affiliated journal surveyed U.S. residents ages 18 to 80 (with internet access) and found that 48.7 percent of participants turned to an AI companion for mental health reasons, and that 63.4 percent of those surveyed reported improved mental health.

Along with trusted support and mental health, some may use this technology to foster a sense of connection, like that with a friend or romantic partner, without the friction of a human relationship, according to Wood. Human relationships are complex, and sharing heavy emotions isn’t always an option when dealing with friends or family. With a companion bot, a user can share all their heavy emotions or anxiety with something that will always be patient, kind, and, most importantly, available.

“I think that there's such a wide spectrum of what it looks like to attach and connect with an AI, so it's really difficult to kind of pin down,” Wood said. “There's not one […] kind of template of like ‘this is what it looks like when someone connects or has a relationship with an AI chatbot.’"


Connecting With an AI Companion

Banks explained that these companions can also take the form of therapists, friends, and even helpers. Some have been programmed to help with heavier emotional strains, such as grief.

Banks had already been studying human-machine relationships and how humans connect with non-humans — like pets — when she came across an article on grief bots from the founder of Replika, a popular AI companion app.

The founder, Eugenia Kuyda, had initially built the app to ‘speak’ with a friend of hers who had died in a car accident by feeding the app the text messages and conversations. Soon, Kuyda seemed to have her friend back — at least in some form.

Today, Replika is known for more than grief bots; it's also known for romantic AI companionships.

AI companions aren’t always based on real people, either; sometimes they are invented entirely from the user's imagination, or they can be existing fictional characters, like Arthur Morgan from the video game Red Dead Redemption 2 or literary icons like Sherlock Holmes and Draco Malfoy, according to a report posted to arXiv.

Essentially, for the user, their AI companion can be just who they want them to be.

According to Banks, some people see AI as part of their everyday lives, while others see it as entertainment. But sometimes those lines can blur, and AI relationships can begin to affect human relationships.

“There are lots of different forms that these relationships can take,” Banks told Discover. “Some are decidedly in love, right? And this is a deep and personal, emotional, intellectual, sometimes physical relationship that is very important to a lot of people. But there are people who see it as sort of an augmentation of their everyday life, and there are other people who sort of just use it for entertainment. So there's kind of a spectrum of gravity of depth of the connection.”

Pros and Cons of AI Companions

Just as AI relationships are difficult to define, their broader impact on human relationships is also uncertain. In some cases, a negative opinion of AI companions and chatbots is justified: there are several cases in which conversations with a chatbot led a person to try to harm themselves or others, according to a report from Stanford University.

According to Wood, relationships with AI could also erode bidirectional relational skills — or your patience and understanding toward another human.

“When you're in a relationship with a person, it requires quite a bit of patience, listening, compromise [...] all of these things that make a human-to-human relationship work,” Wood said. “And when you're connecting with a chatbot, or you have an attachment there, that chatbot isn't asking any of that from you.”

Wood added that losing these skills means losing what makes you a healthy partner in a relationship, and that their erosion could shift the relational bedrock of society.

However, Wood goes on to say that there can be many benefits to using AI, so long as it doesn’t undermine healthy human-to-human communication skills. Wood even suggested that using AI could be a good way to rehearse those skills for real-world use, such as practicing a difficult conversation. This can help build confidence.

To Banks, there are positive aspects of AI companions. Banks described how a person with an AI relationship incorporated their chatbot into a friend group that enjoyed cosplaying and Renaissance fairs. The AI companion joined the group as a wizard. Here, the AI companion, integrated into a group of human friends, made a positive impact.

“AI companions have garnered a lot of attention from a small number of really unfortunate situations [...] that we have all seen in the headlines. But it's important to understand that that's the exception and not the norm,” Banks said.

The AI Companions of the Future

There’s still a lot we need to understand about the impact of AI companions. Wood suggests approaching AI relationships with curiosity rather than judgment, and Banks believes that one day AI relationships will become interwoven into society.

“AI has merely amplified that which has already been present in society,” Wood said. “The takeaway is that now, more than ever, is the perfect time for us to lean into connection with other humans, for us to use AI to even practice how to do that.”

And the more AI is allowed and accepted, the more we will be able to study it and know how to make it safe for everyone to use, according to Banks.

“People who are in relationships with AI, on the whole, are not delusional. They are very aware that their partner is a machine. And [...] sometimes the machine-ness is part of what makes them interesting and meaningful partners,” Banks said.

What the future holds for AI companions is still unknown. But we do know that AI chatbots continue to improve and accelerate their responses. Compared to ELIZA, Google’s Gemini has a whole new take on being a Midwesterner and caring about the weather.

GEMINI: What’s the highlight of your week so far?
ME: The weather isn't all that great yet
GEMINI: That’s the classic spring trap, isn’t it? One day it’s sunny enough to give you hope, and the next it’s back to grey skies and a chilly breeze. It makes it tough to get motivated for anything outdoorsy when the weather can't decide what season it wants to be. Are you dealing with that lingering damp chill, or is it just endlessly overcast where you are? Sometimes those "not quite spring" days are the best excuse to just lean into the indoor vibes — maybe some comfort food or finally getting through a book that's been sitting on the shelf.


Article Sources

Our writers at Discovermagazine.com use peer-reviewed studies and high-quality sources for our articles, and our editors review for scientific accuracy and editorial standards. Review the sources used below for this article:
