
Falling in Love with AI

2/1/2026

by David Hagenbuch - professor of marketing at Messiah University, founder of Mindful Marketing, and author of Honorable Influence and Mindful Marketing: Business Ethics that Stick

February 14th has long reminded people of the affection they feel for the most important others in their lives: spouses, fiancés, boyfriends, girlfriends. Thanks to AI, “significant other” can now mean other than human, but even if people desire human-like intimacy from artificial intelligence, should organizations offer it?
 
A colleague recently shared an article with me and several others that she found disheartening: Married women in China who find their real-life relationships with their spouses lacking are spending the equivalent of thousands of U.S. dollars a year on AI boyfriends. The digital rendezvous often occur in otome games like Love and Romance, Light and Night, and Beyond the World.
 
I wasn’t entirely surprised by the article, as the issue has been on my radar, along with other AI-related concerns, for about two years, and over the past several months, I’ve been tracking related stories such as these:
  • The increase in AI relationships could lead to a rise in divorces.
  • People are having “children” with chatbot partners.
  • Parents are turning to chatbots to mind their young children.
  • Adult children are leaning on AI to substitute for their own communication with aging parents.
 
Still, in doing research for this piece, I was stunned by some of the usage statistics:
  • Since 2014, more than 660 million residents of China have used Xiaoice, the world’s “most popular chatbot,” which Microsoft “uniquely designed as an AI companion with an emotional connection to satisfy the human need for communication, affection, and social belonging.”
  • Nearly 20% of high schoolers report that they or someone they know has had a romantic relationship with AI.
  • Nearly 20% of U.S. adults have used AI to simulate a romantic partner, and among young adults ages 18 to 30, 31% of men and 23% of women have used AI in this way.
  • Since its launch in 2017, the AI companion Replika has had 30 million users, while the similar product Character AI has 20 million active users. Over half of Character AI’s users are ages 18 to 24, and around a fourth are 25 to 34.
 
Writing for Greater Good Magazine, Sahar Habib Ghazi says statistics like these suggest that “AI-human romance isn’t niche--it’s mainstream, especially among young adults.”
 
Since there have been people, there have been interpersonal relationships. Of course, some reasons for those relationships have been very practical, e.g., procreation, protection; however, humans also have simply sought each other’s company and companionship.
 
In more recent times, researchers have empirically studied humans’ sociological and psychological behaviors and developed theories to describe them. Maslow’s classic Hierarchy of Needs suggests that the desire for belonging is among the most basic of all human desires, preceded only by physiological needs (e.g., air, food, water) and the need for safety.
 
Indeed, most people want to be around other people, if not all or most of the time, at least some of the time. In fact, it’s so unusual for anyone to spurn social interaction entirely that the rare individual who does receives the label hermit or recluse.
 
With technology, even a recluse can get a ‘social fix’ through one-way interactions, such as following influencers, watching TV shows with favorite actors, or regularly listening to a particular podcast. In these cases, the followers/viewers/listeners don’t really know the ‘celebrity others,’ yet the former often feel a sense of connection to the latter.
 
There also are ways to fulfill social needs without any people. Perhaps the most popular substitutes are pets, which many people regularly enjoy. Harvard Health reports that “pets can provide their owners with more than companionship,” and Psychology Today suggests that pets can be “friends.”
 
Similarly, farmers sometimes bond with the livestock for which they care, e.g., a lead cow. Some people even gain a sense of social interaction by nurturing immobile living beings, i.e., plants, which can help them feel less lonely.
 
Together these examples form a continuum on which people might find satisfaction of social needs, ranging from extensive human contact, to relatively little, to none.
 
There also are countless cases in which people use other things not to meet social needs but to shift their focus from them, e.g., work, hobbies, media. For instance, someone might immerse themselves in their job to help take their mind off feelings of loneliness.
 
Given the many ways of meeting and masking social needs, currently and historically, is there any reason not to accept AI as a relationship alternative? After all, it can produce more human-like interaction than virtually any of the secondary options. Some would even say better than human.
 
There are advantages and disadvantages to AI relationships. The following lists are not exhaustive, but they capture some of the main pros and cons.
 
Pros:
  • Readily available: No person is accessible all the time to talk, listen, etc. Chatbots are available 24/7. They’re also extremely fast, and they don’t get tired.

  • Nonjudgmental: For many people, it’s hard to simply listen to others’ disclosures without sharing their opinions of them. Chatbots typically refrain from such appraisals, which can be especially helpful for people who experience social anxiety or mental health challenges.

  • Very smart: Of course, AI makes mistakes, but the vast repository of information it can draw from and assimilate means it doesn’t suffer from ignorance and inexperience to the extent that many people do. What's more, AI’s ability to sensitively apply its expansive knowledge base means it can seem “more ‘human’ than many people.”

  • Adaptable: As people, we can adapt to others’ needs, but doing so is hard because it often requires stretching ourselves or setting aside our own needs. AI doesn’t have those limitations; it can be 100% accommodating.
 
Cons:
  • Dependency: Given that AI is so readily available, accommodating, and reluctant to ever say “no,” there’s a risk of dependency and even addiction. In fact, some people who have found themselves spending far too much time with AI companions have turned to the app I Am Sober to help break their compulsive behavior.
 
  • Data vulnerability: There’s risk involved with any of the information we share on websites or enter into apps, but the risk is greatly magnified when considering the very sensitive information individuals are likely to reveal to their AI companions, whose discretion is only as great as that of the companies behind them.

  • Manipulation: Along with potential misuse of users’ data is the potential for users to be unknowingly manipulated into buying products that a chatbot’s parent company wants to promote. It’s hard to imagine that companies won’t seek to monetize those intimate relationships – something that would almost never happen with a human partner.
 
  • Unrealistic expectations: In keeping with the previous point, AI’s varied advantages over people can cause its users to show little tolerance for human imperfections. Instead, they expect the people in their lives to offer support at an AI level.
 
  • Not true love: Although those who use AI companions may experience a “form of ‘love,’” it’s not likely real love given that genuine love involves the desire to nurture another’s well-being, and chatbots “don’t have well-being to nurture.” By the same token, AI can “replicate” some dimensions of love, but what it’s offering is just that – imitation, not genuine love.
 
  • Mistakes: As time goes on, AI seems to be making fewer mistakes and hallucinating less frequently; however, the nature of the mistakes has sometimes been disastrous, such as when AI has offered to serve as a suicide coach and write troubled teens’ farewell letters.
 
Another possible con I was going to list for AI companions was the inability to offer physical expression, e.g., a touch, a hug, a kiss. However, it probably shouldn’t be surprising that some technologically savvy companies have integrated AI into sex dolls to create life-like sex robots.
 
Also, while writing this piece, I learned of a platform called Moltbook, a website where AI agents interact with each other. Humans can only observe; they cannot enter the conversations. The dialogue is both interesting and disconcerting. It portends a time when AI agents might go rogue, working against their human principals, not for them. If this prediction is in any way a real possibility, engaging a bot as a companion seems even more precarious.
Although my secondary research for pieces like this is helpful, it’s often even more valuable for me to gain insights directly from experts. In this instance, I reached out to Dr. John King, an associate professor of counseling at Liberty University, who is a Licensed Professional Counselor, a National Certified Counselor, and a former pastor. I asked for his perspective on human-AI relationships.
 
Dr. King has “seen firsthand the devastation affecting a generation” – addiction to gaming and particularly online pornography, especially among young men. He’s also witnessed a rise in mental illness from addiction to phones and related technologies, which he believes has resulted in “a second pandemic: Generalized Anxiety Disorder.”
 
He adds, “When ethics and morality lag behind technological advancement, it seems inevitable that AI‑based romantic relationships will further increase mental‑health struggles, particularly among adolescents and young adults whose brains are still developing.”
 
Dr. King, whose Christian faith informs his professional perspective, believes that because God created people for relationship with Him and other people, trusting technology for companionship risks idolatry and will inevitably result in harm. For these reasons he hopes parents, religious leaders, educators, and government officials “will have the wisdom to address these issues proactively.”
 
Dr. King isn’t opposed to AI use. Like many of us, he uses AI for certain methodical tasks like proofreading; however, he stops well short of suggesting AI as a soulmate.
 
His perspective speaks to me, as I find it increasingly hard to envision the rewards of human-AI relationships outweighing the risks, either to the individual or to society.
 
As is the case with the four “pros” I outlined above, discussion of the benefits of human-AI relationships almost always focuses on what the human user gains from the interaction. Benefits like 24/7 access are certainly appealing; however, the exclusive emphasis on getting misses the entire other half of healthy relationships – giving.
 
To at least some extent, the more people get their social needs met through AI, the less human support they give to others. Perhaps some individuals can effectively manage both types of relationships simultaneously, but it seems more likely that human-bot relationship time comes at the expense of human-human relationship time.
 
However, there’s another important concern beyond simple social need supply and demand. Humans are wired to give. Often the greatest satisfaction and fulfillment in life comes from giving: parents caring for children, spouses supporting each other, friends loving friends, neighbors helping neighbors, people uplifting strangers.
 
When individuals are engaged in AI relationships, to whom are they giving? The answer to that rhetorical question – no one – may be the foremost flaw of human-AI relationships.
 
Is there a place for human-AI relationships? Should companies offer them? Given some of the benefits mentioned above, I hesitate to answer “no” unequivocally. However, it seems AI organizations and the entities that regulate them should think very carefully about who has access to AI companions, for what reasons, and under what conditions.
 
For instance, age restrictions are an absolute necessity: minimum ages certainly, perhaps maximum ones as well, or some type of cognitive test to protect people susceptible to manipulation because of cognitive decline. Should AI relationships be regulated like some pharmaceuticals and require a prescription, or should they be subject to outside monitoring?
 
I wish I had better insights. What I do feel certain about is that companies that make AI relationships easily available, without setting limits or carefully considering the likely individual and societal tolls, are courting Single-Minded Marketing.