A 10-year-old girl died during the viral ‘blackout challenge.’ Now TikTok could be held liable

TikTok “challenges” have been blamed for grievous spinal injuries, horrific burns, and, at one point in the not-too-distant past, a global shortage of Ozempic.

Now, the world’s second most popular social media app will be forced to explain itself as it battles accusations that it knowingly steered potentially dangerous videos to young people while ignoring warnings of death and destruction in an attempt to maximize profits. An opinion handed down Tuesday by a US appeals court breathed new life into a momentous lawsuit filed by a grieving mother who believes TikTok is culpable for the loss of her daughter.

Federal law “provides TikTok immunity from suit for hosting videos created and uploaded by third parties,” US Circuit Judge Paul Matey wrote in his partial concurrence. At the same time, it does not protect TikTok from being held to account for what Matey described as its alleged “knowing distribution and targeted recommendation of videos it knew could be harmful.”

In a statement provided on Wednesday to The Independent, attorney Jeffrey Goodman, who argued the appeal on behalf of the Anderson family, said, “Big Tech just lost its ‘get-out-of-jail-free’ card. This ruling ensures that the powerful social media companies will have to play by the same rules as all other corporations, and when they indifferently cause harm to children, they will face their day in court.”

TikTok has said that the “safety of users is our top priority.”

Nylah Anderson’s mom, seen here at a press conference held after she filed a lawsuit against TikTok in 2022, wants the company to stop feeding dangerous content to impressionable kids. (Screengrab Saltz Mongeluzzi & Bendesky)

It was the TikTok-infamous “Blackout Challenge” that killed 10-year-old Nylah Anderson, according to her mom. In December 2021, Philadelphia-area resident Tawainna Anderson discovered her daughter’s lifeless body on the bedroom floor, following the “active, happy, healthy, and incredibly intelligent” girl’s attempt to complete the challenge, in which users of the world’s second most popular non-gaming app choked themselves with belts, towels, or other objects until they passed out. Nylah was rushed to the hospital, where she died five days later.

Over an 18-month span between 2021 and 2022, the Blackout Challenge was said to have killed 20 children, 15 of them under the age of 12. And it’s hardly the only dangerous “challenge” to go viral. The “Skullbreaker Challenge,” which involves kicking a person’s legs out from under them as they jump into the air, nearly paralyzed a 13-year-old Pennsylvania girl, and has led to criminal charges in at least one instance. The so-called Angel of Death Challenge, a “game” in which participants jump in front of moving vehicles to see if they’ll stop in time, has reportedly led to multiple deaths. And when a 12-year-old Arizona boy attempted the TikTok “Fire Challenge,” where youngsters record themselves igniting blazes at home, he landed in the ICU and has undergone multiple surgeries since.

Nylah Anderson’s mom blames TikTok for her death. (US District Court for the Eastern District of Pennsylvania)

In May 2022, Anderson sued TikTok and Chinese parent company ByteDance, arguing they were aware of the Blackout Challenge and its dangers but continued to purposefully deliver the videos to young, impressionable kids. The Blackout Challenge had shown up on Nylah’s “For You” page, the lawsuit alleged.

“I cannot stop replaying that day in my head,” Anderson said at a news conference shortly after bringing the lawsuit. “It is time that these dangerous challenges come to an end so that other families don’t experience the heartbreak we live [with] every day.”

TikTok executives “unquestionably knew that the deadly Blackout Challenge was spreading through its app and that its algorithm was specifically feeding the Blackout Challenge to children, including those who had died,” Anderson’s original complaint stated. Following Nylah’s death, TikTok told The Independent that it was instituting controls to protect minors from content that could be harmful. The company also said it had “no evidence” of a “Blackout Challenge” on its site.

TikTok is the world’s second most popular app, behind only Instagram. (Getty Images)

However, in October 2022, a federal judge threw out the case, ruling that TikTok was shielded under an arcane — but hotly debated — portion of a 1996 law that former President Donald Trump tried to do away with after his tweets started being labeled as misinformation.

“Although the circumstances here are tragic, I am compelled to rule that because Plaintiff seeks to hold Defendants liable as ‘publishers’ of third-party content, they are immune under [Section 230 of] the Communications Decency Act,” US District Judge Paul Diamond wrote in his dismissal.

On appeal, Anderson’s attorneys argued that while TikTok did not create the videos in question, the company “took no and/or completely inadequate action to extinguish and prevent the spread of the Blackout Challenge and specifically to prevent the Blackout Challenge from being shown to children... Instead, TikTok continued to recommend these videos to children like Nylah.”

Tuesday’s decision by the United States Court of Appeals for the Third Circuit allows Anderson’s lawsuit to once again proceed, with TikTok and ByteDance named as defendants.

In it, Matey, a former white collar criminal defense lawyer whom Trump appointed to the bench in 2019, noted that Nylah was “still in the first year of her adolescence, [and] likely had no idea what she was doing or that following along with the images on her screen would kill her.”

In his own statement, Goodman’s co-counsel, Samuel Dordick, said, “For decades, Big Tech companies like TikTok have used Section 230 to protect them from accountability for their egregious and predatory conduct. This resounding ruling has decisively stated Section 230 does not extend that far.”

Anderson, for her part, issued a statement through Goodman and Dordick, saying, “Nothing will bring back our beautiful baby girl. But we are comforted knowing that — by holding TikTok accountable — our tragedy may help other families avoid future, unimaginable suffering.”