Parents have launched legal action against social media giant TikTok this week after seven children died attempting one of the latest extreme “challenges” to go viral on the platform.
The “blackout challenge” is an enticing name for an extremely dangerous trend. It encourages TikTok users to post videos of themselves choking until they pass out. TikTok knew that videos promoting the challenge were available on the app – after a ten-year-old died back in December attempting the challenge, the social media company vowed to “remain vigilant in our commitment to user safety” and “immediately remove related content if found.”
The new lawsuit, filed last week in California, alleges that no such action was taken – a decision that directly led to the deaths of seven children throughout 2021. Parents of two of those children – both girls, aged eight and ten years old – have filed a suit against the company.
“TikTok needs to be held accountable for pushing deadly content to these two young girls,” Matthew P. Bergman told Ars Technica. He’s the founding attorney of the Social Media Victims Law Center, a private law firm created to hold social media companies accountable for harming children, and he’s also one of the attorneys on the parents’ legal team.
“TikTok has invested billions of dollars to intentionally design products that push dangerous content that it knows are dangerous and can result in the deaths of its users,” Bergman added.
While TikTok has declined to comment on ongoing legal issues, a spokesperson for the company directed the New York Times to a statement made after the death of Nyla Anderson – at least the fourth child to die attempting the challenge – at the end of last year. In it, the company labels the blackout challenge “disturbing,” but denies that it was “[ever] a TikTok trend,” saying that it pre-dates the platform.
The lawsuit rejects that claim, asserting that the company not only failed to protect its underage users from dangerous content but actively promoted it to them. While it’s the blackout challenge that’s prompted the suit, the case references over 20 other dangerous TikTok trends which have endangered users in recent years.
“TikTok purports to have a minimum age requirement of 13-years-old but does little to verify user’s age or enforce its age limitations despite having actual knowledge that use by underage users is widespread,” the claim says. “TikTok knows that hundreds of thousands of children as young as six years old are currently using its social media product but undertakes no attempt to identify such users and terminate their usage.”
“Small children [are] available to predatory TikTok users in a manner that actively interferes with parental oversight and involvement and puts them in an inherently vulnerable and dangerous position,” it adds.
While legal action against social media sites has often been fraught with free speech issues in the past, no such problem exists in this case. That’s because the parents aren’t suing TikTok for its content, but its design: the company “could manifestly fulfill its legal duty to design a reasonably safe social product and furnish adequate warnings of foreseeable dangers […] without altering, deleting, or modifying the content of a single third-party post or communication,” the lawsuit says.
Basically, the suit claims that because it promotes dangerous videos to children, TikTok’s algorithm and safety features are a faulty product – and the complaint is therefore one of product liability, negligence, and violation of the California Consumer Legal Remedies Act.
“None of Plaintiffs’ claims […] treat TikTok as the speaker or publisher of content posted by third parties,” the suit explains. “Rather, Plaintiffs seek to hold TikTok liable for its own speech and its own silence in failing to warn of foreseeable dangers arising from anticipated use of its social media product.”
While the outcome of the suit won’t be known for some time, the American Academy of Pediatrics’ Council on Injury, Violence and Poison Prevention recommends that parents take their own steps to limit the risk of this happening again.
“Elementary-school-aged children do not have the knowledge or the insight to realize that these are dangerous things to do,” Council chair Lois Lee told the New York Times. When they see a high number of likes or shares, they may think a challenge is fun or safe to try, she said, and advised parents to monitor their children’s social media use and limit screen time.
“TikTok unquestionably knew that the deadly Blackout Challenge was spreading through their app and that their algorithm was specifically feeding the Blackout Challenge to children, including those who have died,” claims the bereaved parents’ suit.
“TikTok knew or should have known that failing to take immediate and significant action to extinguish the spread of the deadly Blackout Challenge would result in more injuries and deaths, especially among children,” they continue.
“TikTok prioritized greater corporate profits over the health and safety of its users and, specifically, the health and safety of vulnerable children TikTok knew or should have known were actively using its social media product.”