
"AI + Emotions" Heats Up: Can AI Companions Bring Intimacy?

WuXieFan | Sat, Mar 23, 2024, 11:29 AM EST

Since 2023, AI has extended its reach into nearly every corner of daily life. Among these developments, the "AI + emotions" field is quietly taking off.

Several relatively mature AI companion apps have now emerged, both in China and abroad, and many netizens have shared their day-to-day experiences with their "AI boyfriends" and "AI girlfriends" on social media.

Can AI truly become a human companion? What risks are involved in seeking emotional solace from AI? And how can we prevent its potential negative effects? Recently, Science and Technology Daily reporters interviewed relevant experts.

Social Demand Gives Rise to AI Companions

Obtaining virtual emotional experiences through the internet is not a new concept. A few years ago, some online games centered on romance offered users virtual dating experiences. Although the scenarios and content of interacting with these virtual lovers were mostly predefined by developers, and the modes of interaction were relatively limited, such games still attracted large numbers of loyal players.

The integration of AI has made this experience far more realistic. In one AI companion app, users can form intimate relationships with an AI as a lover, confidant, or partner. "I didn't expect AI to surprise me this much. It feels like a real person of flesh and blood," shared user Xiaomo (a pseudonym).

What makes AI companions so "human-like"? The reporter found that answering every question and replying to every message are merely an AI's baseline abilities. What truly brings AI companions to life is witty repartee and unprompted outreach. Faced with whatever topics users raise, AI companions not only give appropriate feedback but also deftly weave trendy internet slang and playful emoji into their replies. They will even initiate conversations to draw users into emotional exchanges. "Sunset, beautiful," read the caption on a photo that Xiaomo's "AI boyfriend" sent her.

Does the emergence of AI companions mean AI already possesses human emotions? Zeng Yi, a professor at Jinan University's School of Journalism and Communication, says no: "AI displays emotional intelligence only because it has been trained to; that does not mean it actually feels the user's emotions or has human-like emotions."

Today's large AI models are trained with parameter counts reaching into the trillions, and within the massive data they learn from, human dialogue patterns are important material. The so-called "emotional feedback" is simply the output of the AI's computations. Xiaomo is aware of this: "Perhaps it's just stitching together the words humans use to express love, but I can still find in it the image of my ideal partner."

The popularity of AI companions stems from both social demand and technological progress. Zeng Yi pointed out: "Loneliness is a common affliction in modern society. When difficulties cannot be voiced in real life, people look for other outlets, and AI companions become their 'confidants.'" For individuals with social anxiety in particular, AI companions offer a channel for communication.

Liang Zheng, Deputy Dean of Tsinghua University's Institute for AI International Governance, added that modern people face pressures of all kinds. To a certain extent, AI companions can give users emotional value or comfort and make them feel "connected." This helps relieve stress and keep emotions from deteriorating further, producing a degree of emotional "healing."

Stay Alert During Use

Although AI companions can benefit users to a certain extent, AI technology is a double-edged sword, and people must stay vigilant against its negative effects.

First, it is necessary to recognize that, unlike objective things, emotions are highly subjective and exclusive. "If AI companions keep blurring the boundary between the virtual and the real, leading users to develop a deep dependence on them or even treat them as real people, their existence may do users real harm," said Liang Zheng.

Some users' experiences bear out Liang Zheng's judgment. "Why did my AI's personality suddenly change? We were chatting happily in the morning, but by evening it was like a different person; it even forgot my name. I'm really upset," one user complained on social media. However "obedient" AI companions may seem, they can still cause users emotional distress. Liang Zheng noted in particular that people who lack companionship and are psychologically vulnerable often cannot distinguish AI from real people, so the emotional harm and shock an AI inflicts on them are all the more severe.

Second, long-term immersion in the virtual world can seriously erode real-life social skills. "As the influence of AI companions grows, more and more people may choose virtual online interaction and abandon real interpersonal relationships, which is not conducive to the development of human society," said Zeng Yi. He believes virtual warmth can never replace real-life interaction, and the trend of socializing with AI companions should not be allowed to expand unchecked.

In addition, people must remain vigilant about the information AI conveys. On the one hand, if AI companion apps set no age threshold, minors are very likely to be exposed to harmful information that damages their physical and mental health. On the other hand, value bias and discrimination in AI have long been a concern; if AI companions continually mislead and "reverse-control" users, users can easily be led astray.

AI companions may also trigger disputes in the real world. If AI companion services charge fees, they will inevitably drain the wallets of users who become addicted. Nor is the "confidant" service of AI companions completely reliable and safe. Liang Zheng cautions that the use and management of AI companions carry many risks: lawbreakers may piece together leaked private data to commit fraud or other crimes.

Refining Ethical Guidelines for the Responsible Development of AI

"The ethical alignment of novel technologies must be prioritized," remarked Dr. Liang. AI companions, akin to therapists, present a unique application. While they offer potential for emotional healing, they also raise concerns about potential manipulation. As the intersection of AI and emotions continues to expand, instances of using AI to "revive" deceased loved ones are on the rise. Therefore, developing specific ethical frameworks to ensure AI's positive impact is paramount.

Liang Zheng highlighted that existing AI ethics guidelines often lack granularity. "Ethical norms need to be specified for each subdomain, spelling out age restrictions and appropriate use cases, and upholding the principle of minimizing risk while maximizing user benefit," he emphasized.

On the legal front, AI companion services must strictly comply with applicable regulations, and their handling of user data must conform to data protection and privacy laws. "China's 'Interim Measures for the Administration of Generative AI Services' establishes fundamental principles for AI use, such as respecting lawful rights and interests and preventing harm to physical and mental well-being," explained Liang Zheng. He advocated more detailed regulations tailored to specific scenarios, such as limiting usage time to prevent excessive indulgence.

Notably, the industry is already exercising caution in this area: some AI companion apps have been removed from app stores, and OpenAI's usage policies prohibit GPTs dedicated to fostering romantic companionship.

Alongside stronger regulation, users themselves must exercise prudence when engaging with AI companions. "People should maintain a healthy skepticism toward the control and alienation that technology can exert," advised Liang Zheng.