Author of Honorable Influence - Founder of Mindful Marketing - Author of Mindful Marketing: Business Ethics that Stick
Among the worst marketing practices I’ve discussed over the years, two that immediately come to mind are Ernst & Young (EY) encouraging its employees to cheat on ethics exams (Cultures of Corruption, July 16, 2022) and Volkswagen integrating a “defeat device” into certain cars in order to fool vehicle emissions tests (Dirty Diesel was No Accident, September 26, 2015). While EY’s behavior was deplorable because of its utter irony, VW’s actions involved painstakingly planned manipulation, the likes of which are seldom seen.
However, neither of these approaches is any more appalling than the newest encroachment on moral sensibility: nudify apps.
What are nudify apps? Kerry Gallagher, the education director for ConnectSafely, as well as a school administrator, a teacher, and a mom of two, succinctly describes them as apps that “take a regular clothed photo of a person and use artificial intelligence to create a fake nude image.”
Although using a nudify app to create such images should by itself seem improper, what makes matters worse is that the apps’ users routinely share the fake photos with others, often students as young as middle schoolers, who then use the deepfake photos to harass and humiliate classmates.
The most infamous case of such shaming occurred in June 2024 in Australia, where deepfaked nude images of about 50 girls from two private schools were widely distributed. The perpetrator was a male student, formerly of one of the schools.
As one can imagine, the victims of nudify apps, who are often the last to know what’s been done, are devastated. The National Center for Missing and Exploited Children (NCMEC) is “deeply concerned about the potential for generative artificial intelligence to be used in ways that sexually exploit and harm children.” More specifically, NCMEC issues a stern warning about the damage nudify apps do:
“These manipulative creations can cause tremendous harm to children, including harassment, future exploitation, fear, shame, and emotional distress. Even when exploitative images are entirely fabricated, the harm to children and their families is very real.”
It might seem that creating a fake nude image of someone would clearly be illegal, but as often happens with new technology, laws lag behind individuals’ and organizations’ actions. In the United States, a provision in the Violence Against Women Reauthorization Act of 2022 made the sharing of intimate images without consent grounds for civil action in federal court, but if the images shared are fakes, i.e., not real explicit images, does that provision even apply?
Regardless of that potential legal loophole, the fact that using nudify apps may be legal doesn’t mean doing so is ethical.
The significant psychological and social harms the images cause their victims are certainly moral concerns. However, such negative outcomes aren’t the only ethical grounds on which nudify apps should be judged. The behavior also violates at least two time-tested values:
- Fairness: Every person has a right to privacy, including privacy of their body. Even though they are not actual photographs, the images that nudify apps create look “hyper-realistic” because the algorithms that generate them have been trained on “large datasets of explicit images,” giving viewers the impression that they are actually seeing the victim naked. It’s unfair to have the right to physical modesty ‘stripped away’ without consent.
- Decency: The human body is a beautiful thing, not inherently indecent. However, over millennia, most cultures have adopted rational norms that limit physical exposure in public by prescribing what people should wear, from loincloths to leggings. Many societies have codified those norms into laws, like statutes against public indecency, or into formal guidelines, like the Motion Picture Association’s film rating system (PG-13, R, etc.). The point is, abundant precedent suggests that the primary end of nudify apps, to indiscriminately publicize human nakedness, including among minors, is fundamentally indecent.
So far the focus of this article has been on the users of nudify apps, who are certainly culpable for their shameful acts. At the same time, when the perpetrators are themselves children, it’s especially important to ask: Who else should bear responsibility? Those accountable should include:
- Parents: Although it’s impossible to monitor everything one’s kids do on their laptops and phones, parents must establish at least some safety limits. Moreover, parents should model and discuss appropriate behaviors more broadly so their children assimilate values that will positively guide their daily choices.
- Institutions: Schools should be proactive in addressing nudify apps with their students, letting them know that the apps are off-limits and warning students of the consequences for violations.
- Government: Legislatures at all levels should consider how they can limit, if not eliminate, nudify apps. Some states, like New Jersey, are making the use of nudify apps a criminal offense.
- Associations: For the benefit of their fields, professional groups can take stands against nudify apps specifically, and more generally they should clearly communicate the values of fairness and decency that are fundamental to rejecting the apps, as well as future technology built on similar impropriety.
There’s one other set of responsible parties not mentioned above because they deserve accountability above all others: the apps’ creators.
It’s hard to imagine how the dozens of marketers of nudify apps justify their products. Maybe some rationalize, “They’re for people to nudify themselves,” but who needs to do that? In most imaginable instances, the apps’ purpose is to undress others without their knowledge or consent and then share the sordid deepfakes widely.
As often happens when business strategy goes awry, money has likely overshadowed any plausible mission for the creators of nudify apps and woefully skewed the tech entrepreneurs’ ambitions. Likewise, the apps’ creators seemingly failed to self-censor, or to follow the moral mandate: Just because we can doesn’t mean we should.
One entity that can’t reasonably be held responsible is AI. Artificial intelligence is basically a value-neutral tool, often used for good purposes but sometimes for nefarious ones, as nudify apps illustrate. AI largely does what it’s told to do without questioning the ethicality of the instructions, which is the obligation of people.
As I’ve found through my own experiences using AI and as the following articles expound, it’s up to humans to hit pause when potential ethical issues arise and to ask the moral question, “Is this something we should be doing?”
- Who will be the Adult in the Room with AI?
- What Sales AI Can and Can't Do
- Questions are the Key to AI and Ethics
Abominable, egregious, heinous, indefensible, reprehensible – maybe all these adjectives are needed to adequately describe the destructive nature of nudify apps. One other descriptor that should be included is Single-Minded Marketing.
Learn more about the Mindful Matrix.
Check out the book, Mindful Marketing: Business Ethics that Stick