
Why Can't TikTok Block the Blackout Challenge?

1/1/2022


by David Hagenbuch - professor of Marketing at Messiah University - author of Honorable Influence - founder of Mindful Marketing

Many people’s New Year’s resolutions are to eat less and exercise more.  Fortunately, few people need to promise to kill less.  That goal, though, may be a good one for the world’s fastest-growing social media platform, in order to better protect the lives of young users who are oblivious to the dangerous game they’re playing.
 
Nyla Anderson was a “happy child” and “smart as a whip”—she even spoke three languages. Tragically, the 10-year-old Pennsylvania girl’s life was cut short on December 12, when she died while attempting a perilous social media trend called the Blackout Challenge.
 
The Blackout Challenge “requires the participant to choke themselves until they pass out and wake up moments later.”  Sadly, some who participate, like Nyla, never wake up, and if they don’t die, they may suffer seizures and/or brain damage.
 
It’s tragic, but young people likely have engaged in foolhardy, life-threatening behavior since the beginning of humankind.  Within a few years of my high school graduation, two of my classmates lost their lives in separate car crashes caused by high-speed, reckless driving.  Most people probably can share similar stories of people they knew who needlessly died too young.   
 
In some ways it’s inevitable that young people’s propensity for risk-taking paired with a limited sense of their own mortality will lead them to endanger themselves and encourage others to do the same.  What’s inexplicable is how older and presumably more rational adults can encourage and even monetize such behavior, which is what some suggest TikTok has done.
 
Unfortunately, Nyla is not the only young person to pass away while attempting the Blackout Challenge.  Other lives the ill-advised trend has taken include 12-year-old Joshua Haileyesus of Colorado and 10-year-old Antonella Sicomero of Palermo, Italy.  TikTok provided the impetus for each of these children to attempt the challenge.
 
Most of us know from experience that peer influence can cause people to do unexpected and sometimes irrational things.  In centuries gone by, that influence was limited to direct interpersonal contact and then to traditional mass media like television.  Now, thanks to apps like TikTok, anyone with a smartphone holds potential peer pressure from people around the world in the palm of their hand.
 
In TikTok’s defense, the Blackout Challenge predates the social media platform.  ByteDance released TikTok, or Douyin as it’s known in China, in September of 2016.  Children had been attempting essentially the same asphyxiation games, like the Choking Challenge and the Pass-out Challenge, many years prior.  In fact, the Centers for Disease Control and Prevention (CDC) reported that 82 children, aged 6 to 19, likely died from such games between 1995 and 2007.

It’s also worth noting that individuals and other organizations create the seemingly infinite array of videos that appear on the platform.  ByteDance doesn’t make them; it just curates the clips according to each viewer’s tastes, using one of the world’s most sophisticated and closely guarded algorithms.
 
So, if TikTok didn’t begin the Blackout Challenge and it hasn’t created any of the videos that encourage it, why should the app bear responsibility for the deaths of Nyla, Joshua, Antonella, or any other young person who has attempted the dangerous social media trend?
 
It’s reasonable to suggest that TikTok is culpable for the self-destructive behavior that happens on its premises.  A metaphor might be a property owner who makes his house available as a hangout for underage drinking.  The homeowner certainly didn’t invent alcohol, and he may not be the one providing it, but if he knowingly enables the consumption, he could be legally responsible for “contributing to the delinquency of a minor.”
​
By hosting Blackout Challenge posts, TikTok could be contributing to the delinquency of minors.
 
I have to pause here to note an uncomfortable irony.  Less than four months ago, just after Frances Haugen blew the whistle on her former employer Facebook, I wrote a piece titled “Two Lessons TikTok can Teach Facebook.”  In the article, I described specific measures TikTok had taken to, of all things: 1) discourage bad behavior, and 2) support users’ mental health.
 
How could I have been so wrong?  Although I certainly may have been misguided—it wouldn’t be the first time—TikTok’s actions that I cited truly were good things.  So, maybe the social media giant deserves to defend itself against the new allegations.
 
TikTok declined CBS News’ request for an interview, but it did claim to block content connected to the Blackout Challenge, including hashtags and phrases.  It also offered this statement: “TikTok has taken industry-first steps to protect teens and promote age-appropriate experiences, including strong default privacy settings for minors.”
 
The notion of protecting teens is certainly good; however, it’s hard to know what those “industry-first steps” are.  Furthermore, age-appropriateness and privacy are important priorities, but neither objective aligns particularly well with the need to avoid physical harm—the main problem of the Blackout Challenge.
 
In that spirit, and in response to accusations surrounding Nyla’s death, TikTok offered Newsweek a second set of statements:
 
“We do not allow content that encourages, promotes, or glorifies dangerous behavior that might lead to injury, and our teams work diligently to identify and remove content that violates our policies.”
 
"While we have not currently found evidence of content on our platform that might have encouraged such an incident off-platform, we will continue to monitor closely as part of our continuous commitment to keep our community safe. We will also assist the relevant authorities with their investigation as appropriate."
 
These corporate responses do align better with the risks the Blackout Challenge represents.  However, there’s still a disconnect:  TikTok claims it’s done nothing to facilitate the Blackout Challenge, but family members of those lost say the social media platform is exactly where their children encountered the fatal trend.
 
The three families’ tragedies are their own, but they’re far from the only cases of people seeing the Blackout Challenge on TikTok and posting their own attempts on the app.  TikTok has taken measures that have likely helped ‘lessen the destruction,’ but it’s unreasonable for it to claim exoneration.
 
The company’s app must be culpable to some degree, but what exactly could it have done to avoid death and injury?  That question is very difficult for anyone outside TikTok or without significant industry expertise to answer; however, let me ask one semi-educated question—Couldn't TikTok use an algorithm?
 
As I’ve described in an earlier blog post, “Too Attached to an App,” ByteDance has created one of the world’s most advanced artificial intelligence tools—one that, with extreme acuity, serves app users a highly customized selection of videos that can keep viewers engaged indefinitely.
 
Why can’t TikTok employ the same algorithm, or a variation of it, to keep the Blackout Challenge and other destructive videos from ever seeing the light of day?
 
TikTok is adept at showing users exactly what they want to see, so why can’t it use the same advanced analytics with equal effectiveness to ‘black out’ content that no one should consume?
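Only TikTok knows how its recommendation and moderation systems actually work, so any answer from the outside is necessarily a sketch.  Still, the general idea is simple enough to illustrate.  The Python snippet below is a minimal, hypothetical illustration of a pre-recommendation safety gate: a blocklist of challenge-related hashtags and phrases combined with a harmful-content classifier score, so that flagged videos never enter the feed.  Every name, term, and threshold here is an assumption made for illustration, not TikTok’s actual code or API.

# Minimal, hypothetical sketch of a pre-recommendation safety gate.
# The names, terms, and thresholds are illustrative assumptions, not
# TikTok's actual (proprietary) moderation pipeline.

from dataclasses import dataclass, field

# Hashtags/phrases tied to known dangerous "challenges" (illustrative list)
BLOCKED_TERMS = {"blackoutchallenge", "passoutchallenge", "chokingchallenge"}

@dataclass
class Video:
    video_id: str
    caption: str
    hashtags: set = field(default_factory=set)
    harm_score: float = 0.0  # hypothetical harmful-content classifier output, 0..1

def is_recommendable(video: Video, harm_threshold: float = 0.5) -> bool:
    """Allow a video into the recommendation pool only if it clears both gates."""
    # Gate 1: hard block on any flagged hashtag or phrase in the caption.
    squashed_caption = video.caption.lower().replace(" ", "")
    if video.hashtags & BLOCKED_TERMS or any(t in squashed_caption for t in BLOCKED_TERMS):
        return False
    # Gate 2: a classifier score above the threshold keeps the video out of
    # the feed (in practice it might be routed to human review instead).
    return video.harm_score < harm_threshold

if __name__ == "__main__":
    safe = Video("v1", "My dog learns a new trick", {"dogsoftiktok"}, harm_score=0.02)
    risky = Video("v2", "try the black out challenge", {"blackoutchallenge"}, harm_score=0.91)
    print(is_recommendable(safe))   # True
    print(is_recommendable(risky))  # False

Whether a real classifier could catch enough of this content without over-blocking legitimate videos is exactly the kind of question only the company can answer, but the building blocks clearly exist.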
 
The truism ‘nobody’s perfect’ aptly suggests that every person is, in a manner of speaking, part sinner and part saint.  TikTok and other organizations, which are collections of individuals, are no different, doing some things wrong and other things right but hopefully always striving for less of the former and more of the latter.
 
Based on its statements, TikTok likely has done some ‘right things’ that have helped buffer the Blackout Challenge.  However, given the cutting-edge technology the company has at its disposal, it could be doing more to mitigate the devastating impact.  For that reason, TikTok remains responsible for “Single-Minded Marketing.”
​
10 Comments
Sarah Clemson
2/4/2022 04:36:30 pm

I do not believe that TikTok should necessarily be held responsible for the deaths of these children, but I do think the platform could do a better job at monitoring children's activity on the app. Firstly, there should be an age limit for children to join the platform. Yes, children can lie about their age, but if they're obviously younger than they say they are, TikTok moderators can figure that out.
TikTok moderators have not been great at removing content that violates their community guidelines. Despite receiving reports that certain videos are not fit to be on the app, TikTok often allows them to stay online anyway.
Ultimately it is the parents' responsibility to ensure that their children are using the Internet safely. Parents should discuss Internet safety with their children regularly and should warn them against dangerous social media trends such as the TidePod Challenge and this Blackout Challenge.

Julia Mary Register
2/5/2022 06:48:46 pm

I'm a little shocked that TikTok did not put a stop to the Blackout Challenge when multiple people started posting videos of it. If TikTok is about hosting a safe space for children and teens, they should not be allowing content like this. I completely understand why the parents of those children blame the app for hosting the challenge. However, I also think parents with kids that young shouldn't let them be on social media like TikTok.

From a publicity standpoint, I'm not sure TikTok was helping its reputation by denying any evidence of the challenge on its app. I think it would have been more helpful to admit that TikTok has caused harm and promise to do better. Also, I'm sure this hasn't been the first challenge to cause harm to minors. Why haven't TikTok and other social media apps worked to create algorithms to shut these down?

Faith Harlow
2/6/2022 03:07:22 pm

I hadn’t even heard about the blackout challenge until reading this article. I also don’t have TikTok, but I think I would have heard about this matter if it were being addressed as a priority by the company and they chose to really take a stand on it. I feel morally as though TikTok should have some sort of responsibility in the deaths from this challenge, as far as allowing the circulation of the trend to continue, yet there are probably many more situations like this from other happenings on the app. There can be many things that trigger self-harm, and with almost anything being able to be on the internet, I can see why TikTok itself can’t take the full blow. However, thinking about apps like Instagram that easily tag posts with warnings or remove them for certain content, I think a better solution to minimize harmful trends like this is possible. What resonated with me the most was reading about the victims, who were 10-12 years old. This age group is so easily and heavily influenced by others, and if things like this are seen as "trendy," it is not wise for them to be around that. Like others have said before, parents should also have some sort of responsibility for allowing younger kids to be a part of the community on the app, yet it is on TikTok as an organization to advertise itself to the appropriate audience, which according to the app store is 12+. All in all, no matter the age, a harmful ‘trend’ like this serves no purpose and therefore should not be tolerated by TikTok as entertainment or just brushed aside.

Kara Wiegel
2/6/2022 07:18:43 pm

I had never heard of the Blackout Challenge before this post. I have heard of another popular challenge that was supposedly dangerous called Red Door Yellow Door. However, for TikTok to promote or allow this on their platform (even if they claim otherwise) is harmful to young kids, teens, and young adults. Their system for deciding what is harmful, dangerous, or wrong does not make sense.
Unfortunately, three children have died from this challenge, and TikTok claims that there wasn’t anything on their platform to cause this. What measures has TikTok taken to help with ‘less destructive behaviors’? This content has popped up and continues to, even after TikTok claims to have ‘fixed it.’ TikTok should do what Google does when you search anything self-destructive and give a helpline phone number. It is ridiculous for a company that knows its audience (almost too well sometimes) not to have any noticeable solutions besides just fixing its algorithms and what gets posted.

Livy Crocenzi
2/6/2022 11:40:50 pm

I think that TikTok should have immediately put a stop to the Blackout Challenge as soon as the first child passed away. TikTok needs to also improve their moderation of the app, as there have been plenty of videos that violate guidelines, and yet TikTok looks the other way. They should have made an effort to get rid of any videos that encouraged the challenge, instead of denying that there was evidence of the challenge on the app.
I do believe that TikTok should be held morally responsible for the deaths of the children because of their lax efforts to prevent more children from dying from the challenge. However, parents of young children should be more aware of what is going around on the Internet, and moderate what their children are watching.

Victoria Slotter
2/7/2022 10:03:18 am

I for one was horrified reading this article. It started by discussing how intelligent and gifted Nyla Anderson was before she died attempting the blackout trend. I just had several classes pertaining to adolescent development, and we learned that adolescents are quite literally incapable of making rational decisions. Their prefrontal cortex is not fully developed until early/mid adulthood. As adults, we are tasked with understanding that they are literally unable to make rational decisions the way we can, and we are required to make a safe space for these individuals, especially if we own a platform that millions of adolescents are subscribed to. I believe this is a major fault of TikTok for not monitoring the trend more closely. I also saw that someone said that TikTok should have gotten rid of the challenge after the first child died, but I argue that the challenge should barely have had a presence on the app before TikTok banned or removed the videos. I did look up "blackout challenge" on TikTok, and the first thing that popped up was "Learn how to recognize harmful challenges and hoaxes." I have just recently rejoined this platform and have already seen multiple creators being shadowbanned or having their content removed for "violating community guidelines". I would also like to note that the majority of videos being taken down are political or siding with one political party. If you look it up, TikTok tends to shadowban content creators of color more so than their white counterparts, which is incredibly problematic. If TikTok can monitor people of color as well as political videos and have them taken down at an alarmingly fast rate, they should have been able to get onto this "blackout challenge" (regardless of whether TikTok facilitated the trend or not) before there were any fatalities due to the app.

Xander Duerksen
2/8/2022 06:45:21 pm

I remember hearing about this absurd trend prior to reading this article, which is why this particular article interested me. I was unaware that very similar trends existed before platforms like TikTok, and that honestly came as a surprise to me. However, I think this partially stems from TikTok being social media. It makes issues like this much less isolated, even bringing them into the spotlight. While children being unable to make rational decisions is certainly nothing new, TikTok's potent influence over children certainly is and I have no doubt that TikTok is (inadvertently or not) creating a lot of issues that would otherwise have been non-existent or isolated.

Additionally, I actually have a friend who is studying communications and who recently did a large research project on the impact of social media such as TikTok on children, and what he found was frankly disturbing: the research revealed that apps like this not only influence the way these kids develop but mold that development to a higher degree than one might expect.

To conclude, I'm not necessarily certain of TikTok's responsibility for the unfortunate deaths of these children. However, platforms like TikTok need to be cognizant of the immense impact they have on a younger audience, and address it appropriately.

Nicole Shank
2/8/2022 10:42:44 pm

I do not think that TikTok should necessarily be held responsible for the deaths of these children, but I do think the platform should have done a better job at monitoring children's activity on the app. I also think that it is important to note that this is not only a problem with TikTok, but with many other social media apps including Snapchat, YouTube, Instagram, among others. Social media apps need to do a much better job of not only handling situations like these but also preventing them more effectively.
TikTok moderators have not done well with removing content that violates their community guidelines. Even after they receive reports that certain videos are not fit to be on the app, TikTok often allows them to stay online anyway. This content can then be seen by young children, who may not fully understand the risks of these dangerous trends.
On the other hand, social media apps are not the only ones to be blamed. I believe that monitoring children's use on the Internet begins with the parents, and it is their responsibility to make sure that their kids are being safe online. Parents should make their children aware of how to use the Internet in a safe manner and what sites and content are not appropriate for them.

Tyler Riley
2/9/2022 01:41:13 pm

I think that TikTok could employ certain age restrictions and even employ a more restrictive algorithm. However, even with these precautions it would be difficult to prevent young children on the app from being exposed to dangerous situations or trends. Social media algorithms will always be criticized for being either too strict or not strict enough. Children are highly impressionable and providing them access to a network of videos from around the world comes with many risks. TikTok should continue to improve algorithms and restrictions to the best of their ability but parents should also take precaution to be aware of the content their children might be exposed to. Parents cannot be blamed for their children's innocent curiosity, but should take every precaution to ensure their children's safety despite what TikTok is able to restrict or not.

Tyler Whitesel
2/10/2022 09:54:06 am

I had never heard of the blackout challenge before reading this article, and if I am to be honest, I am incredibly disheartened to hear that such young children lost their lives due to outside influence. If I am being perfectly honest with myself, I do not know much about how TikTok is run, nor do I know everything that goes on inside of the company. However, if there is a trend occurring in which adults and even children are actively hurting themselves to the point where they could die, I feel that it is the responsibility of the one distributing the trend to at the very least try to shut these actions down. Sure, TikTok did not start the trend, nor did they ever do anything blatantly encouraging it; however, they also did not do anything to try and stop it before tragedy struck. The blackout challenge, in my opinion, should have been taken down long before the tragedy of children losing their lives occurred, since even without considering that consequence, it is painfully clear that choking yourself to the point of passing out is not healthy physically or mentally. Now, I do not believe that the actual blackout challenge is the only thing TikTok did wrong; according to this article, they completely denied all involvement with this event, when according to families they were involved. I believe they should have at least taken some responsibility for these actions, even if it may have hurt them as a company, instead of playing it off as if they are innocent. Imagine being the parent of one of the kids that just died and knowing what the reason was, but the perpetrator outright denied they had anything to do with it. Would that not hurt and cause you to at least be somewhat angry with them? I believe it is important for companies to take responsibility for their own actions and shortcomings.
