
Falling in Love with AI

2/1/2026


by David Hagenbuch, professor of marketing at Messiah University, founder of Mindful Marketing, and author of Honorable Influence and Mindful Marketing: Business Ethics that Stick

February 14th has long reminded people of the affection they feel for the most important others in their lives: spouses, fiancés, boyfriends, girlfriends. Thanks to AI, “significant other” can now mean other than human, but even if people desire human-like intimacy from artificial intelligence, should organizations offer it?
 
A colleague recently shared an article with me and several others that she found disheartening: Married women in China who find their real-life relationship with their spouse lacking are spending the equivalent of thousands of U.S. dollars a year on AI boyfriends. The digital rendezvous often occur in otome games like Love and Romance, Light and Night, and Beyond the World.
 
I wasn’t entirely surprised by the article, as the issue has been on my radar, along with other AI-related concerns, for about two years, and over the past several months, I’ve been tracking related stories such as these:
  • The increase in AI relationships could lead to a rise in divorces.
  • People are having “children” with chatbot partners.
  • Parents are turning to chatbots to mind their young children.
  • Adult children are leaning on AI to substitute for their own communication with aging parents.
 
Still, in doing research for this piece, I was stunned by some of the usage statistics:
  • Since 2014, more than 660 million residents of China have used Xiaoice, the world’s “most popular chatbot,” which Microsoft “uniquely designed as an AI companion with an emotional connection to satisfy the human need for communication, affection, and social belonging.”
  • Nearly 20% of high schoolers report that they or someone they know has had a romantic relationship with AI.
  • Nearly 20% of U.S. adults have used AI to simulate a romantic partner, and among young adults ages 18-30, 31% of men and 23% of women have used AI in this way.
  • Since its launch in 2017, the AI companion Replika has had 30 million users, while the similar product Character AI has 20 million active users. Over half of Character AI’s users are age 18 to 24, and around a fourth are 25 to 34.
 
Writing for Greater Good Magazine, Sahar Habib Ghazi says statistics like these suggest that “AI-human romance isn’t niche – it’s mainstream, especially among young adults.”
 
Since there have been people, there have been interpersonal relationships. Of course, some reasons for those relationships have been very practical, e.g., procreation, protection; however, humans also have simply sought each other’s company and companionship.
 
In more recent times, researchers have empirically studied humans’ sociological and psychological behaviors and developed theories to describe them. Maslow’s classic Hierarchy of Needs suggests that the desire for belonging is among the most basic of all human desires, preempted only by physiological needs (e.g., air, food, water) and the need for safety.
 
Indeed, most people want to be around other people, if not all or most of the time, then at least some of the time. In fact, it’s so unusual for anyone to spurn social interaction entirely that the rare individual who does receives the label hermit or recluse.
 
With technology, even a recluse can get a ‘social fix’ through one-way interactions, such as following influencers, watching TV shows with favorite actors, or regularly listening to a particular podcast. In these cases, the followers/viewers/listeners don’t really know the ‘celebrity others,’ yet the former often feel a sense of connection to the latter.
 
There also are ways to fulfill social needs without any people. Perhaps the most popular substitutes are pets, which many people regularly enjoy. Harvard Health reports that “pets can provide their owners with more than companionship,” and Psychology Today suggests that pets can be “friends.”
 
 
Similarly, farmers sometimes bond with the livestock for which they care, e.g., a lead cow. Some people even gain a sense of social interaction by nurturing immobile living beings, i.e., plants, which can help them feel less lonely.
 
Together these examples form a continuum on which people might find satisfaction of social needs, ranging from extensive human contact, to relatively little, to none.
 
There also are countless cases in which people use other things not to meet social needs but to shift their focus from them, e.g., work, hobbies, media. For instance, someone might immerse themselves in their job to help take their mind off feelings of loneliness.
 
Given the many ways of meeting and masking social needs currently and historically, is there any reason not to accept AI as a relationship alternative? After all, it can produce more human-like interaction than virtually any of the secondary options. Some would even say better-than-human.
 
There are advantages and disadvantages to AI relationships. The following two lists are not exhaustive, but they capture some of the main pros and cons.
 
Pros:
  • Readily available: No person is accessible all the time to talk, listen, etc. Chatbots are available 24/7. They’re also extremely fast, and they don’t get tired.

  • Nonjudgmental: For many people, it’s hard to simply listen to others’ disclosures without sharing their opinions of them. Chatbots typically refrain from such appraisals, which can be especially helpful for people who experience social anxiety or mental health challenges.

  • Very smart: Of course, AI makes mistakes, but the vast repository of information it can draw from and assimilate means it doesn’t suffer from ignorance and inexperience to the extent that many people do. What's more, AI’s ability to sensitively apply its expansive knowledge base means it can seem “more ‘human’ than many people.”

  • Adaptable: As people, we can adapt to others’ needs, but it’s hard because as we do, we often need to stretch ourselves or set aside our own needs. AI doesn’t have those limitations; it can be 100% accommodating.
 
Cons:
  • Dependency: Given that AI is so readily available, accommodating, and reluctant to ever say “no,” there’s risk of dependency and even addiction. In fact, some humans who have found themselves spending far too much time with AI companions have turned to the app I Am Sober to help break their obsessive-compulsive behavior.
 
  • Data vulnerability: There’s risk involved with any of the information we share on websites or enter into apps, but the risk is greatly magnified when considering the very sensitive information individuals are likely to reveal to their AI companions, whose discretion is only as great as that of the companies behind them.

  • Manipulation: Along with potential misuse of users’ data is the potential for users to be unknowingly manipulated into buying products that a chatbot’s parent company wants to promote. It’s hard to imagine that companies won’t seek to monetize those intimate relationships – something that would almost never happen with a human partner.
 
  • Unrealistic expectations: In keeping with the previous point, AI’s varied advantages over people can cause its users to show little tolerance for human imperfections. Instead, they expect the people in their lives to offer support at an AI level.
 
  • Not true love: Although those who use AI companions may experience a “form of ‘love,’” it’s not likely real love given that genuine love involves the desire to nurture another’s well-being, and chatbots “don’t have well-being to nurture.” By the same token, AI can “replicate” some dimensions of love, but what it’s offering is just that – imitation, not genuine love.
 
  • Mistakes: As time goes on, AI seems to be making fewer mistakes and hallucinating less frequently; however, the nature of the mistakes has sometimes been disastrous, such as when AI has offered to serve as a suicide coach and write troubled teens’ farewell letters.
 
Another possible con I was going to list for AI companions was the inability for physical expression, e.g., a touch, a hug, a kiss. However, it probably shouldn’t be surprising that some technically savvy companies have integrated AI into sex dolls to create life-like sex robots.
 
Also, while writing this piece, I learned of a platform called Moltbook, a website where AI agents interact with each other. Humans can only observe; they cannot enter the conversations. The dialogue is both interesting and disconcerting. It portends a time when AI agents might go rogue, working against their human principals, not for them. If this prediction is in any way a real possibility, engaging a bot as a companion seems even more precarious.
  
Although my secondary research for pieces like this is helpful, it’s often even more valuable for me to gain insights directly from experts. In this instance, I reached out to Dr. John King, an associate professor of counseling at Liberty University, who is a Licensed Professional Counselor, a National Certified Counselor, and a former pastor. I asked for his perspective on human-AI relationships.
 
Dr. King has “seen firsthand the devastation affecting a generation” – addiction to gaming and particularly online pornography, especially among young men. He’s also witnessed a rise in mental illness from addiction to phones and related technologies, which he believes has resulted in “a second pandemic: Generalized Anxiety Disorder.”
 
He adds, “When ethics and morality lag behind technological advancement, it seems inevitable that AI‑based romantic relationships will further increase mental‑health struggles, particularly among adolescents and young adults whose brains are still developing.”
 
Dr. King, whose Christian faith informs his professional perspective, believes that because God created people for relationship with Him and other people, trusting technology for companionship risks idolatry and will inevitably result in harm. For these reasons he hopes parents, religious leaders, educators, and government officials “will have the wisdom to address these issues proactively.”
 
Dr. King isn’t opposed to AI use. Like many of us, he uses AI for certain methodical tasks like proofreading; however, he stops well short of suggesting AI as a soulmate.
 
His perspective speaks to me, as I find it increasingly hard to envision the rewards of human-AI relationships outweighing the risks, either to the individual or to society.
 
As is the case with the four “pros” I outlined above, discussion of the benefits of human-AI relationships almost always focuses on what the human user gains from the interaction. Benefits like 24/7 access are certainly appealing; however, the exclusive emphasis on getting misses the entire other half of healthy relationships – giving.
 
To at least some extent, the more people get their social needs met through AI, the less human support they give to others. Perhaps some individuals can effectively manage both types of relationships simultaneously, but it seems more likely that human-bot relationship time comes at the expense of human-human relationship time.
 
However, there’s another important concern beyond simple social need supply and demand. Humans are wired to give. Often the greatest satisfaction and fulfillment in life comes from giving: parents caring for children, spouses supporting each other, friends loving friends, neighbors helping neighbors, people uplifting strangers.
 
When individuals are engaged in AI relationships, to whom are they giving? The answer to that rhetorical question – no one – may be the foremost flaw of human-AI relationships.
 
Is there a place for human-AI relationships? Should companies offer them? Given some of the benefits mentioned above, I hesitate to answer “no” unequivocally. However, it seems AI organizations and the entities that regulate them should think very carefully about who has access to AI companions, for what reasons, and under what conditions.
 
For instance, age restrictions are an absolute necessity – minimum ages and perhaps maximum ones, or some type of cognitive screening to protect people made susceptible to manipulation by cognitive decline. Should AI relationships be regulated like some pharmaceuticals and require a prescription, or should they be subject to outside monitoring?
 
I wish I had better insights. What I do feel certain about is that companies that make AI relationships easily available, without setting limits and carefully considering the likely individual and societal tolls, are courting Single-Minded Marketing.
Subscribe to Mindful Matters blog.
Learn more about the Mindful Matrix.
Check out the book, Mindful Marketing: Business Ethics that Stick
12 Comments
Gisaiah Griffin
2/4/2026 12:22:10 pm

At first, I found the article comical and asinine – that some people were having real-world relationships with AI itself. But putting myself into the situation allowed me to reflect more clearly. The article stated that within Maslow’s hierarchy of needs, companionship or a sense of belonging with the people around us is very important. So important, in fact, that the existence of real-life situations where people are in relationships with AI should be telling.

Reply
Hannah Rayner
2/4/2026 02:19:14 pm

This article highlights a significant and still-growing issue in society. While opinions on AI are varied, I hope no one views a romantic relationship with it as a good thing. In my mind, it only shows how people's lives are becoming more digitized and less real/personal. If we conduct even our romantic relationships through a screen, we miss the necessary HUMAN aspect of community and belonging.

Reply
Ben Graham
2/4/2026 05:03:05 pm

This article was definitely an interesting read. I thought it was quite idiotic that people were actually having relationships with AI. I still stand by the fact that this is idiotic no matter how lonely or lacking companionship one may be. However, I do agree that having company and relationships do help a lot in life, so when one is lacking this aspect in life, they may become desperate. I pray I never end up like that.

Reply
Olivia Kirchner
2/4/2026 05:05:13 pm

This article was interesting to read, as I never knew such a large number of people were actually engaging in romantic relationships with AI to this degree. I had thought it was more taboo than it actually seems to be. It's especially hard to deal with an issue like this because of how new AI is. Longitudinal studies have not been conducted to see the long-term effects of interacting with artificial humans in this way; therefore, companies advocating for or providing people with this service should be much more cautious and have more boundaries, if they have to offer the AI relationship at all.

Reply
Emilie Stefanchik
2/4/2026 07:20:48 pm

I heard about this happening in the news not too long ago, but even after reading this article it baffles me how someone turns to a computer for a relationship – especially since some were divorcing to begin the relationship with the AI. But I think what surprised me the most was the percentage of high schoolers who reported that they or someone they know has had a romantic relationship with AI. I find it so important to be around other humans; the less time you spend with other people, or even a pet or plant, and the more time you spend on devices, the more detrimental it can be to your mental and physical health.

Reply
Abel Brunk
2/4/2026 07:47:01 pm

I thought this article highlighted a deep-rooted problem of the digital age for our society. That problem is isolationism, where people, especially younger generations, are spending so much time online that they don't want to, or can't, properly interact with real people. I personally think it is absurd to have romantic feelings toward a computer program, and I think the people who are looking just to have "someone" affirm everything they say need to face their anxiety or whatever is holding them back from interacting with real people. If I were struggling with interpersonal interactions, I would work hard to face that anxiety because it is important to make lasting relationships with other people.

Reply
Hailey Fegan
2/4/2026 08:37:48 pm


I read the article titled “Falling in Love with AI.” I found this article interesting because it talked about an issue that our society deals with today—the unethical use of AI. I found it really weird that people were actually having relationships with AI. I find it odd that people fulfill their psychological and social needs by developing a relationship with AI. Not only is this unethical, but it is just weird. I do think there are some good purposes for AI, but developing a relationship with it is not one of them.

Reply
Dylan Hart
2/4/2026 10:03:15 pm

I found this article interesting because the rise of AI use isn't just for technical reasons but for social reasons as well. I also find it concerning that people are trying to fulfill their social needs with an empty substance, like an AI chatbot. I also think people use these empty "substances" to escape reality, in the sense that people think they can have children with an AI chatbot avatar. I agree with Dr. King's assessment of the situation, especially in the sense that AI will be turning into the next pornography.

Reply
Angeline Delaluz
2/4/2026 11:02:23 pm

Upon reading this article, I have been exposed to a new perspective on AI relationships. Until now, I did not realize how addictive it could be to partake in a relationship (even a romantic one) with AI. I did not think it was possible to form a deep connection with AI. I believe the people who are being fulfilled by these romantic relationships with AI have the illusion that there is depth to their relationship, but it is artificial and superficial. I understand farmers having a relationship with their livestock and dogs being friends to humans, but where I draw the line is thinking it is okay to have a romantic relationship with AI. It was absolutely not God's plan for us to have a romantic relationship with AI but rather a relationship with Him and with those around us (and a romantic relationship between a man and a woman like it is laid out in Genesis). Reading this article was a wake-up call about the power of technology. It is time to step away from technology and prioritize having deep connections with humans.

Reply
Elijah Perry
2/5/2026 10:12:42 am

The article I am addressing is the Falling in Love with AI article. My first thought on the article was disbelief at the idea that someone would actually have a "real" "dating" relationship with AI. As I read more, I realized that for some, especially those struggling with mental health issues or just a lack of social interaction, this may not be so far off. I was particularly interested in the pros and cons of even a standard relationship or interaction with AI that were mentioned in the article. For one, the idea of having someone to talk to who can give you meaningful insight on the things you ask, all at a very fast pace, is enticing. On the other hand, the real and serious problem of data vulnerability stuck out to me, knowing how personal data can be a tool for destruction in the hands of the wrong person. Overall, the marketing matrix clearly defines this type of problem between humans and AI as single-minded, as it is undeniably efficient but is ethical for some and arguably unethical for most.

Reply
Evan Sarkett
2/5/2026 10:13:53 am

One problem with interacting primarily with AI social companions is that it is very easy and requires hardly any effort or chance of failure. Some of the most impactful times in relationships are when there is conflict and misunderstanding, and working through the difficult times is what leads to deeper and stronger relationships. In all things, taking shortcuts can rob us of the full value of things. Using AI on a homework assignment might get the homework done, but no real learning or growth occurs. In the same way, using AI for social interaction might fulfill an immediate need but hamper emotional growth and perseverance.

Reply
Austin Turner
2/5/2026 10:43:48 am

The article I read was "Falling in Love with AI." The issue that I found most concerning was people using AI as a therapist rather than talking to someone about it in person. From my experience, conflicts in a relationship help to strengthen the bond rather than break it apart. With AI there is no real conflict and no true connection between the user and the computer, leading to just information being given, not advice. If someone is struggling with mental health problems, I believe the most logical thing is to talk it out with someone you trust rather than with an AI that knows nothing about you.

Reply




    Copyright 2025
    David Hagenbuch
