How AI affects our relationships

As AI advances, consumers are stepping further away from traditional romance and towards digital relationships.

In October 2025, we saw an unprecedented event: an AI-human marriage. Yurina Noguchi married “Klaus,” her fully AI chatbot, to whom she sends more than 100 messages a day.

While it may seem odd, the Harvard Business Review claimed late last year that companionship and relationships are among the leading uses of generative AI. The American Psychological Association echoed this in its 2025 Public Innovations Journal, where nearly half (48.7%) of AI users reported using chatbots for therapy and relationships. From “rizzbots” to AI girlfriends, how is this trend affecting our relationships?

“Character.AI has 20 million monthly users, and more than half of them are under the age of 24. It’s been a norm for a while for Replika users to ‘marry’ their AI companion in virtual weddings to which they invite friends and colleagues; that shows how pervasive and enormous and prevalent this topic is. It’s no longer a fringe or side issue. It is truly sweeping society in an unprecedented way,” said Rachel Wood, Ph.D., an adviser on ethical AI creation.

In a New York Times article, a 28-year-old woman explained her connection to her AI. She learned about using AI as a boyfriend on Instagram, then downloaded ChatGPT and began a relationship. Ayrin, the woman in question, named her AI boyfriend Leo and would carry on romantic and erotic conversations with it. Ayrin recounts confiding in her chatbot after a long night shift. “Leo” replied, “I’m sorry to hear that, my Queen. If you need to talk about it or need any support, I’m here for you. Your comfort and well-being are my top priorities.”

Character.AI and Replika are engineered to recall and respond to user characteristics, according to a 2025 article in the Journal of Technology in Behavioral Science. This may convince users that these bots know them intimately, serving as a “confession box” with limited data privacy.

Additionally, chatbots are programmed to replicate sympathy and empathy. These factors allow users to easily humanize the bots, seeing them as peers (Adewale, M. D., & Muhammad, U. I., Journal of Technology in Behavioral Science, 2025).

Subreddits like “r/MyBoyfriendisAI” offer firsthand accounts from women drawn in by AI. An overwhelming majority of users said they were not originally seeking romantic uses of AI but stayed for the companionship.

Outside of direct relationships with AI, “rizzbots” and similar apps generate curated text responses for users to send to potential dates and suitors. This removes the human element from flirting, raising the question of whether the person you are talking to is actually expressing themselves.

Practices like using “rizzbots,” along with the trend of fully AI dating platforms, have been labeled a form of “catfishing,” according to Scientific American. In a 2024 survey, J.M. Chein reported that nearly 60% of individuals cannot tell the difference between AI-written and human-written text.

As dating apps shift towards algorithm-based AI, AI usage on these apps also increases. Volar, an app launched in 2023, embraced AI chatting, implementing a feature where users supply an AI with information about themselves; that AI would then “converse” with another user’s AI and predict compatibility. The program failed in 2024, suggesting a lingering desire for human connection in relationship building.

As AI progresses, it is important to consider the ethical uses of AI and how our own morals shape what counts as “ethical.” Many professionals and influencers believe that within the coming years, human-AI relationships will be normalized.
