
Young people spend a lot of time on social media despite mounting evidence that it's not good for them, with health experts pointing to harms ranging from lost sleep and eating disorders to suicide. Internal records of companies that run the most popular platforms are the centerpiece of a fast-growing pile of lawsuits that accuse them of knowingly hooking kids before they even reach their teens. The suits face multiple obstacles, starting with the 1996 law that gives internet platforms broad immunity from harmful content posted by users.

1. Who's suing, and why?

Children, adolescents and young adults - sometimes via their parents, siblings or other family members - have sued the companies based on claims of psychological distress, physical impairments and death. Their cases make up the vast majority of 211 lawsuits filed across the US as of April 17 that have been assigned to two judges in California - one in state court in Los Angeles and the other in federal court in Oakland. At least two dozen of the most recent complaints were brought by school districts on behalf of students. The plaintiffs argue that companies including Facebook parent Meta Platforms, Snap, TikTok and Alphabet's Google continue to target children in pursuit of profit despite academic and medical studies showing that this audience is particularly vulnerable to the addictive effects of their platforms while their bodies and minds are still developing.

2. What internal records are they relying on?

The lead case was filed about a year after Frances Haugen, a former product manager turned whistleblower at Facebook, revealed internal documents showing that Meta had long known its platforms - which include Instagram in addition to Facebook - have ill effects on young people, especially girls struggling with body image issues. Records disclosed by TikTok parent ByteDance in response to lawsuits suggest that the company knows young people are more susceptible to being lured into trying dangerous and even deadly stunts they view on the platform - known as viral challenges - because their ability to weigh risk isn't fully formed. Product research at ByteDance concluded that the No. 1 reason identified by teens for participating in the challenges is "getting views/likes/comments," followed by "impressing others online."

3. What are the companies accused of doing?

The specific claims differ by platform but generally allege that the social media giants are borrowing from the same playbook of behavioral and neurobiological techniques used by the gambling and cigarette industries. The companies are accused of designing endless, algorithm-generated feeds to induce young users into a so-called "flow state," in which they react to incessant notifications that manipulate dopamine levels, encourage repetitive account checking, and reward round-the-clock use. And with that use comes the most valuable prize: troves of data about young users' preferences, habits and behaviors that are sold to advertisers. For the companies, the parents say, a child addict today is an adult user tomorrow. The companies say they are offering more resources to keep children safe online and argue that the lawsuits improperly seek to regulate content.

4. What kind of harm do the suits allege?

Addictive use of social media results in an array of psychological disorders, and in extreme cases self-harm and suicide, according to the lawsuits. Some studies have shown that many young people, despite using social media frequently, don't particularly enjoy it. Yet the more they use the apps, the harder they are to quit. The lawsuits also allege that by making minors' profiles public, the platforms are handing detailed background information to sexual predators who can easily discern their friends, activities, interests, and even location. Flawed age verifications and fake accounts exacerbate the problem, parents argue.

5. What's the legal foundation for the claims?

Most lawsuits are built on product liability claims alleging that the platforms are defectively designed and that the companies have failed to warn users of the risks of addiction. Similar lines of argument have driven decades of litigation targeting cigarettes, asbestos, medical devices and prescription drugs, with varying degrees of success. In about 15 of the social media cases, companies face wrongful death claims brought on behalf of young people who died by suicide. The suits filed by school districts take a different approach by alleging that the platforms have created a public nuisance. That's similar to lawsuits over the opioid addiction crisis, which so far have resulted in drugmakers, distributors and retailers paying almost $50 billion in court damages and settlements.

6. How are the companies defending themselves?

Their first line of defense is Section 230 of the Communications Decency Act, the 27-year-old federal statute that has consistently shielded companies from liability over the comments, ads, pictures and videos on their platforms. The U.S. Supreme Court is currently wrestling with whether to roll back some of that far-reaching protection. Even if it does, there are other hurdles for the addiction cases. It will be difficult to prove that social media use alone is to blame for health issues suffered by kids exposed to a variety of influences and life experiences. Another challenge will be establishing that the algorithms that decide which content a user sees should be treated as defective "products," a word that typically means a tangible consumer good. Some states bar consumers from suing for "pure emotional harm" over defective products if they've suffered no physical injury.