Mindful Marketing

Should AI Impersonate People?

7/1/2022


by David Hagenbuch - professor of Marketing at Messiah University - author of Honorable Influence - founder of Mindful Marketing


“Imitation is the sincerest form of flattery”—it is a high compliment when people respect someone’s work enough to replicate it.  But when the smart speaker of one of the world’s largest companies starts imitating people’s voices, has flattery drifted into deceit?
 
It’s difficult to keep pace with innovation in artificial intelligence (AI), but one particular advance that's certainly worth attention is the impending ability of Amazon’s Alexa to mimic voices.  After hearing no more than a minute of audio, the smart speaker reportedly will be able to deliver a plausible impersonation.
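Amazon hasn’t published the details of how this works, but short-sample voice cloning typically follows a common pattern: a brief recording is reduced to a compact “speaker embedding,” and a text-to-speech model is then conditioned on that embedding to generate new speech in a similar-sounding voice.  The sketch below is purely illustrative; every name in it (SpeakerEncoder, VoiceConditionedTTS, and so on) is a hypothetical stand-in, not Alexa’s actual system.

    # Illustrative sketch of a typical few-shot voice-cloning pipeline.
    # All class and function names are hypothetical placeholders.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class SpeakerEmbedding:
        """Compact numeric 'fingerprint' of how a voice sounds."""
        vector: List[float]

    class SpeakerEncoder:
        def encode(self, audio_sample: bytes) -> SpeakerEmbedding:
            # A real encoder would analyze roughly a minute of audio;
            # here we simply return a dummy embedding.
            return SpeakerEmbedding(vector=[0.0] * 256)

    class VoiceConditionedTTS:
        def synthesize(self, text: str, speaker: SpeakerEmbedding) -> bytes:
            # A real model would generate audio matching the target voice;
            # here we return placeholder bytes.
            return f"[synthetic audio of: {text}]".encode()

    encoder = SpeakerEncoder()
    tts = VoiceConditionedTTS()

    # One short recording is enough to condition all later speech.
    relative_voice = encoder.encode(audio_sample=b"about one minute of recorded speech")
    audio = tts.synthesize("Once upon a time...", speaker=relative_voice)
    print(len(audio), "bytes of imitation audio")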
 
Alexa’s voice is apparently one that appeals to a very large number of consumers:  A 2021 Statista study showed that Alexa was the most widely used assistant across four of six age demographics. So, why would Amazon want to mess with the sound that’s helped it sell so many smart speakers?
 
According to Amazon senior vice president Rohit Prasad, the change “is about making memories last,” particularly remembrances of those who’ve passed.
 
In many ways that motive makes the voice mimicking technology seem like a great idea.  For those who have lost loved ones, one of the greatest blessings would be to hear their dearly departed’s voice again.
 
Since my father passed away last August, I’ve thought several times how nice it would be to talk with him again—to hear his opinion about the latest news, to ask him questions that only he could answer.
 
On a lighter note, and also related to Alexa’s voice imitation, I’ve always enjoyed good impressionists.  It’s fun to hear comedians who can act and sound like famous people.  One of my favorites is Frank Caliendo, who is best known for impressions of famous sports figures; his John Madden and Charles Barkley impressions are great!
 

[Video: Frank Caliendo impersonating John Madden on the Late Show with David Letterman]
 
So, I can see why Alexa doing impressions of people we knew and loved could be popular.  However, AI impersonations should also give us pause for at least four reasons:
 
1.  More than a voice:  Of course, just because someone, or something, sounds like a person we know, doesn’t mean they are that person.  Every individual is a unique curation of beliefs, affections, and experiences that influence what they say and even how they say things.
 
Frank Caliendo may sound like Charles Barkley, but he obviously isn’t the NBA legend and popular sports broadcaster.  Consequently, Caliendo can never truly say what Barkley would say, and neither can AI.  Only the person themselves knows what they would say.
 
2.  Respect for the deceased:  Per the previous point, if AI speaks for anyone, beyond playing back a recording of them speaking, it’s putting words in that person’s mouth.  A living person could conceivably give such permission, but how would a dead person do the same, short of adding some kind of addendum to their last will and testament, allowing AI impersonation?
 
I’m not sure it would be fair to ask anyone before their passing to give a smart speaker carte blanche use of their voice.  As hard as it is to let go of people we loved, it’s something we must do.  The longer we’d allow AI to speak for a loved one, the greater the probability that the technology would say things to tarnish their memory.
 
3.  Vulnerable consumers:  Given how good machines already are at imitating life, it will likely become increasingly easy for techno fakes to fool us.  However, there are certain groups of people who are at much greater risk of being duped than the average individual, namely children and older people.
 
It’s scary to think how those with heinous motives might use AI voice imitation to make young children believe they’re hearing the words of a trusted parent, grandparent, etc.  Similarly, the Mindful Marketing article “Preying on Older People” described how senior citizens are already frequent targets of phone scammers pretending to be someone they’re not.  AI voice imitation could open the floodgates for such abuse.
 
4.  Distorting the truth:  Thanks to fake news, native advertising, deepfake video and the like, the line between what’s real and what’s not is becoming more and more difficult to discern.  University of Maryland professor of psychology Arie Kruglanski warns that a truthless future is not a sustainable one:
 
“Voluminous research in psychology, my own field of study, has shown that the idea of truth is key to humans interacting normally with the world and other people in it. Humans need to believe that there is truth in order to maintain relationships, institutions and society.”
 
“In the extreme, a lost sense of reality is a defining feature of psychosis, a major mental illness.  A society that has lost its shared reality is also unwell.”
 
While examples of innovation in imitation are fascinating, it’s concerning that in the not-too-distant future, fakes may become undetectable.  At that point, our world will be well on the path to what Kruglanski forewarned: ‘losing its sense of reality’ and becoming ‘unwell.’
 
In the 1994 movie Speed, Sandra Bullock and Keanu Reeves try to stop a city bus that’s triggered to explode if it drops below 50 mph.  AI deception can feel like that runaway bus, barreling forward with no way to stop it or even slow it down.
 
However, large corporations like Amazon share the driver’s seat and have some control over the AI vehicle.  Although having them put the brakes on innovation may be too much to ask, they can at least integrate some form of notification to clearly indicate when people are seeing or hearing a fake and not the real thing.
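What such a notification might look like in practice is an open question.  The sketch below shows only one hypothetical approach, in which any synthesized clip carries a synthetic-audio flag and an audible disclosure that plays before it; none of this reflects anything Amazon has announced.

    # Hypothetical sketch: synthetic audio is never handed to a listener
    # without an attached disclosure. This is not an Amazon API.
    from dataclasses import dataclass

    @dataclass
    class LabeledClip:
        audio: bytes
        is_synthetic: bool
        disclosure: str  # notice spoken or displayed before playback

    def label_synthetic_clip(audio: bytes, imitated_person: str) -> LabeledClip:
        notice = f"The following audio is a computer imitation of {imitated_person}."
        return LabeledClip(audio=audio, is_synthetic=True, disclosure=notice)

    def play(clip: LabeledClip) -> None:
        # A real device would speak the disclosure before the clip;
        # printing it here simply shows the intended ordering.
        if clip.is_synthetic:
            print(clip.disclosure)
        print(f"[playing {len(clip.audio)} bytes of audio]")

    play(label_synthetic_clip(b"synthetic audio bytes", imitated_person="a deceased relative"))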
 
Even with such notifications, Alexa’s application of voice impersonation is fraught with potential for abuse.  For the four reasons outlined above, Amazon should shutter plans for its smart speaker to imitate people and thereby avoid talk of “Single-Minded Marketing.”


2 Comments
Roger Sanford
7/3/2022 07:50:54 pm

Mindful is the key here. Is Amazon that kind of company? This solution has the ability to do good in the world IF it is used Mindfully. The points you raise are worthy of debate!

Johnny Mcbride
9/15/2022 11:43:59 am

This form of AI drives forward the question of whether we will allow AI to continue its assimilation into our daily lives or whether we will resist the onslaught of AI empowerment. This kind of technology assuredly has positive uses, but it would seem that the positives can't outweigh the possible negatives when we look at the ways voice imitation may be detrimental. First and foremost in my mind would be its use for defaming and attacking enemies by saying damaging things under the facade of an AI voice. Such attacks would be difficult to debunk, as few things are more reliable than the tone of one's own voice (until now, that is). This and the other points brought up in the post make it seem quite single-minded to allow such tech to continue.
