TikTok Must Face Lawsuit Over 10-Year-Old Girl’s Death, US Court Rules

A U.S. appeals court has revived a lawsuit against TikTok filed by the mother of a 10-year-old girl who died after attempting a viral “blackout challenge” she saw on the platform. The challenge, which dared users to choke themselves until they passed out, was allegedly recommended to the girl by TikTok’s algorithm.

Although federal law typically protects internet companies from lawsuits over content posted by users, the Philadelphia-based 3rd U.S. Circuit Court of Appeals ruled on Tuesday that this protection does not extend to claims that TikTok’s algorithm recommended the harmful challenge to Nylah Anderson, the young girl who died.

U.S. Circuit Judge Patty Shwartz, writing for the three-judge panel, clarified that Section 230 of the Communications Decency Act of 1996 only provides immunity for information supplied by third parties, not for recommendations made by TikTok through its algorithm. She acknowledged that this ruling deviates from past court decisions, which have often held that Section 230 shields online platforms from liability for failing to prevent the transmission of harmful content by users.

However, Judge Shwartz noted that this reasoning no longer holds following a U.S. Supreme Court ruling in July. That ruling addressed whether state laws restricting the power of social media platforms to moderate content they find objectionable infringe upon the platforms’ free speech rights. The Supreme Court determined that a platform’s algorithm reflects “editorial judgments” about how it compiles and presents third-party content, making the resulting curation the platform’s own speech. By that logic, Judge Shwartz reasoned, algorithmic content curation is first-party speech by the company, which Section 230 does not protect.

“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” Judge Shwartz wrote.

TikTok did not respond to requests for comment.

The ruling overturned a lower-court decision that had dismissed the lawsuit on Section 230 grounds. Tawainna Anderson, Nylah’s mother, sued TikTok and its Chinese parent company ByteDance after her daughter died in 2021 while attempting the blackout challenge using a purse strap in her mother’s closet.

“Big Tech just lost its ‘get-out-of-jail-free card,’” said Jeffrey Goodman, the attorney representing Nylah’s mother, in a statement.

In a partially concurring opinion, U.S. Circuit Judge Paul Matey criticized TikTok for prioritizing profits over other values, stating that the platform may choose to serve children content that appeals to “the basest tastes” and “lowest virtues.” However, he emphasized that TikTok “cannot claim immunity that Congress did not provide.”