Courts challenge platform design as parents demand stronger safeguards

Two back-to-back jury verdicts in the United States this week have delivered one of the strongest legal challenges yet to the business models of social media companies, raising fresh questions about accountability, platform design and child safety.
In California and New Mexico, juries found Meta — and in one case Google-owned YouTube — liable for harm caused to young users, with combined penalties and damages reaching $381 million. Beyond the financial impact, the rulings reflect a deeper shift: courts and juries are increasingly willing to accept arguments that platform design — not just user behaviour — contributes directly to mental health harm among children.
Parents and child safety advocates see the decisions as long overdue. For years, social media companies have argued that harms stem from broader societal issues or misuse by individuals. This week’s verdicts suggest juries are no longer convinced — and could mark the beginning of a broader legal and regulatory push against Big Tech.
Two separate juries delivered rulings within 24 hours:
New Mexico (Tuesday): A jury found Meta liable for harming children’s mental health and exposing them to risks such as sexual exploitation. It imposed a $375 million penalty under state consumer protection law.
Los Angeles (Wednesday): A jury found Meta and YouTube liable for harming a young user through addictive platform design, awarding $6 million in damages, including $3 million in punitive damages.
Together, the rulings mark one of the most significant legal setbacks yet for social media companies.
The decisions reflect a broader shift in how courts — and the public — view social media platforms.
For years, companies argued that they were not responsible for user behaviour or content harms. But these cases focused instead on product design, including features such as autoplay, notifications and infinite scrolling.
“For the first time, courts have held social media platforms accountable for how their product design can harm users,” said Nikolas Guggenberger, an assistant professor of law.
Legal experts say this approach could reshape future lawsuits by bypassing protections under Section 230, which generally shields platforms from liability for user-generated content.
The Los Angeles jury concluded that:
Meta and YouTube were negligent in designing their platforms
Their design was a substantial factor in causing harm
They knew or should have known their services posed risks to minors
They failed to adequately warn users
Jurors also found that both companies had acted with “malice, oppression or fraud”, leading to additional punitive damages.
The plaintiff said her near-constant use of social media “really affected my self-worth.”
The New Mexico lawsuit, filed by Attorney General Raúl Torrez, focused on:
Exposure of children to sexual predators
Allegations that Meta knowingly harmed children’s mental health
Claims that the company concealed risks
Investigators built the case by posing as children and documenting interactions on the platforms.
The jury awarded $375 million, though prosecutors may seek additional remedies in a second phase of the trial.
At a glance:
- Meta found liable in two separate jury cases
- YouTube found liable in the California case
- $375 million penalty imposed in New Mexico
- $6 million in damages awarded in California
- Juries found negligence and a failure to warn users
- Design features such as autoplay and infinite scroll scrutinised
- Both companies plan to appeal the verdicts
- Thousands of similar lawsuits are pending
For many families, the verdicts are both validation and heartbreak.
Brian Montgomery, whose 16-year-old son died after being targeted in a sextortion scheme, called it a turning point: “We’re talking about the most financially sound business that the planet has ever known. This will set an expectation.”
Deb Schmill, whose daughter died after buying drugs through a social media platform, said: “That’s the painful part of all of this… If this could have been done five years ago, 10 years ago. Things would be so different.”
Many parents say safeguards have come too late — and are now pushing for legislation.
Both companies have rejected the findings and plan to appeal.
Meta said it “respectfully disagree[s] with the verdict,” adding that “teen mental health is profoundly complex and cannot be linked to a single app.”
A Google spokesperson said the case “misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”
Not immediately. The rulings do not mandate specific design changes, and appeals could delay or overturn the penalties. A second phase in the New Mexico case could impose further restrictions.
However, experts say the real impact may come later.
Jury trials “level the playing field” for large tech companies, said former Meta engineer Arturo Bejar, adding that meaningful change often comes when regulators intervene.
So far, the financial effect appears limited.
Meta reported $201 billion in revenue last year, far exceeding the penalties imposed.
Investors have largely shrugged off the verdicts, with the company’s stock closing slightly higher after the rulings.
But analysts warn that forced changes to platform design could pose a far greater long-term risk to business models.
Yes — and this may be just the beginning.
Thousands of similar cases are already pending
The California case is a bellwether trial, guiding future litigation
Attorneys general in more than 40 states are pursuing lawsuits against Meta
These cases could eventually lead to a broader settlement similar to past actions against tobacco or opioid companies.
While the cases were tried in the US, the implications are global: they could increase pressure on platforms to improve child safety features, invite stricter regulation worldwide, and raise awareness among parents about screen time and online risks.
A 2025 Pew Research survey found 48% of teens believe social media harms people their age, up from 32% in 2022.
The legal process is far from over:
Appeals could take years
Additional damages or remedies may be imposed
Lawmakers may step in with new regulations
Meanwhile, attention is already shifting to the next frontier — including risks from AI-driven platforms and chatbots.
- with inputs from AFP and AP
© Al Nisr Publishing LLC 2026. All rights reserved.