In dozens of lawsuits, parents blame Meta, TikTok for hooking kids
Plaintiffs in more than 70 lawsuits say Silicon Valley’s algorithms are causing real-world harm
A 16-year-old girl in Utah becomes so obsessed with her body image after getting hooked on Instagram that she develops anorexia and bulimia. A boy from Michigan goes from watching YouTube videos for several hours a day at age 9 to binging all night on TikTok and Snapchat, then ends up sharing a nude photo of himself on Snapchat with a stranger who circulates it widely online. A Connecticut girl struggles for more than two years with an extreme addiction to Instagram and Snapchat before she succumbs to severe sleep deprivation and depression and takes her own life—at age 11. Under the platforms' terms of use, she shouldn't even have had accounts before she turned 13.
These children and others like them are the face of a novel effort to use litigation to pin responsibility for the alleged dangers of social media on the companies that run the most popular platforms. More than 70 lawsuits have been filed this year against Meta, Snap, ByteDance's TikTok, and Google centering on claims from adolescents and young adults who say they've suffered anxiety, depression, eating disorders, and sleeplessness as a result of their addiction to social media. In at least seven cases, the plaintiffs are the parents of children who've died by suicide. The suits make claims of product liability that are new to social media but have echoes of past campaigns against tobacco companies and automobile manufacturers.
The idea that social media companies shoulder responsibility for the potential damage their products cause to young people gained prominence late in 2021, when former Meta Platforms Inc. employee Frances Haugen came forward with documents about the company's internal operations. Among Haugen's allegations was a claim that the company was knowingly preying on vulnerable young people to boost profits. Haugen revealed an internal study at Meta-owned Instagram finding that many adolescent girls using the photo-sharing app were suffering from depression and anxiety around body-image issues.
At the least, the new legal front presents a public-relations challenge for already-embattled tech companies. Their defense relies on Section 230 of the Communications Decency Act, the 26-year-old federal statute giving internet companies broad immunity from claims over harmful content posted by users. The law has so effectively shielded them from legal claims that voices on both the political left and right have called for its reform. For good measure, the companies also cite their constitutional free-speech rights as publishers to control their content. It may be months before a federal judge rules on whether to let the litigation proceed.
"All sorts of services may be 'addictive' in the habit-forming sense—from television to video games to shopping for clothes—but the law does not impose liability simply for creating an activity which some may do too much, even where, as here, it allegedly results in tragedy," Snap Inc. argued in a court filing seeking dismissal of the case filed by the 11-year-old girl's mother.
The lawsuits are crafted to get around Section 230 by focusing on the algorithms the companies use to curate and deliver content rather than the content itself. Jen King, a research fellow at Stanford's Institute for Human-Centered Artificial Intelligence who studies algorithmic manipulation, says there's been little clinical research into social media addiction. The legal approach may be novel enough, she says, that "it has a greater chance of success, especially after Frances Haugen's whistleblower leak."
Spokespeople for Snap and Meta's Instagram declined to comment on pending litigation but said in emails that the companies are working to protect their youngest users, including by offering mental-health resources and improving safeguards against the spread of harmful content. Representatives for TikTok and ByteDance didn't respond to a request for comment. A Google spokesperson said the company has "invested heavily in creating safe experiences for children" and cited parents' ability to limit screen time or restrict access to certain content.
In a rare product liability case that survived a fight over Section 230, a federal appeals court in California ruled last year that a lawsuit could proceed over a Snapchat filter that recorded real-time speed and allegedly encouraged reckless driving by teens, resulting in a fatal accident. But Meta has argued in a court filing that the ruling on the speed filter isn't relevant to the case of the 11-year-old girl, because the "danger" to drivers in the Snap case was the speed filter itself, not any content created by its users.
Eric Goldman, a Santa Clara University law professor who's written extensively about Section 230, agrees that the plaintiffs' rationale in the case against Meta is weak. "If anything, the argument that the algorithms are to blame really highlights what the plaintiffs are suing about is third-party content," he says. Beyond that, Goldman says it'll be difficult to prove that social media use alone is to blame for health issues in today's "complex society," in which kids are exposed to all kinds of influences.
Another hurdle for plaintiffs will be establishing that the algorithms deciding what content social media users see can be treated as defective "products," a category that typically covers tangible consumer goods, says Adam Zimmerman, a law professor at Loyola Marymount University. Even if the algorithms do qualify as products, platform users in some states might be barred from suing for "pure emotional harm" if they've suffered no physical injury, he adds.
In the most extreme cases, though, the harm has been more than emotional. Janet Majewski, whose 14-year-old daughter, Emily, killed herself, sued TikTok and its parent, ByteDance, along with Snap and Meta in August, saying the companies are to blame for the excessive screen time that took Emily down a dangerous path.
Majewski, who lives in Grovetown, Ga., recalls checking Emily's phone a week before her death but says she didn't see anything concerning. "As parents, we're not seeing what they're seeing," Majewski says, explaining how children and teens absorb social media differently than adults. The companies' goal "is to get as much time from these kids on their product, but there's no safeguards," she says. "They need to change what they're feeding these children—change the algorithm so it doesn't take them into these dark places."