Instagram parent Meta fined $375 million: A major moment for online child protection

Internal Meta documents revealed staff warned execs about child safety risks

Mark Zuckerberg, CEO of Meta, looks on during the US Senate Judiciary Committee hearing "Big Tech and the Online Child Sexual Exploitation Crisis" in Washington, DC, on January 31, 2024.
AFP

Dubai: A jury in New Mexico, United States, has ordered Meta, the parent company of Instagram, Facebook, and WhatsApp, to pay $375 million (roughly Dh1.38 billion at current exchange rates) in civil penalties after finding the company misled consumers about the safety of its platforms and endangered children.

It is the largest verdict of its kind, and a first: New Mexico has become the first state in the United States to defeat a major technology company at trial over the harm its platforms cause to young people.

The verdict was delivered in the case of State of New Mexico v. Meta Platforms, Inc., following more than two years of litigation by the New Mexico Department of Justice. The jury found Meta liable for both claims brought under New Mexico's Unfair Practices Act and ordered the company to pay the maximum penalty available under the law — $5,000 per violation.


Following the landmark verdict, New Mexico Attorney General Raúl Torrez said, “The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.” 

Torrez added, “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

A Meta spokesperson said the company "respectfully" disagrees with the decision and plans to appeal. Late last month, as the trial proceeded, Instagram launched new alerts that will notify parents if their teenager repeatedly searches for suicide or self-harm content on the platform.

What the evidence showed

The evidence presented at trial included internal Meta documents and testimony from former Meta employees, law enforcement officials, and educators — and what it revealed was, by any measure, damning.

The jury heard that Meta's design features enabled paedophiles and predators to engage in child sexual exploitation on its platforms.

“Evidence from those witnesses and other industry experts also demonstrated that Meta intentionally designs its platforms to addict young people and, contrary to Meta’s public commitments, expose them to dangerous content related to eating disorders and self harm,” the New Mexico Department of Justice said in a statement.

The evidence also showed that Meta executives were warned by their own staff and by outside child safety experts.


A watershed moment for parents

For anyone who has spent years wondering whether their instincts about these platforms were correct, this verdict arrives with the quiet weight of confirmation.

"This is a watershed moment for every parent concerned about what could happen to their kids when they go online — and this victory belongs to them," Torrez said.

The case began in 2023, when the New Mexico Department of Justice launched an investigation into Meta's platforms to protect children from sexual abuse, online solicitation, and other harms.

What the investigation uncovered over the following two years led directly to this trial — and this verdict. It also follows a two-year-long Guardian investigation into Meta’s practices.

Dr Joanne Gray, chair of discipline in media and communications at the University of Sydney, said, "A group of ordinary American citizens did what US regulators have so far failed to do. They looked at the evidence and found that Meta puts profits over user safety. This jury decision sends a clear message to all the Big Tech platforms: they need to do better, especially when it comes to keeping kids safe."

This is far from over...

The $375 million penalty, significant as it is, may not be the end of New Mexico's case against Meta. A further bench trial is scheduled to begin on May 4, during which the state will argue its public nuisance claim and seek injunctive relief — meaning it will ask the court to require Meta to make specific, concrete changes to how it operates.

Those demands include effective age verification, the removal of predators from the platform, and protections for minors against encrypted communications that shield bad actors from detection.

Torrez added, "In the next phase of this legal proceeding, we will seek additional financial penalties and court-mandated changes to Meta's platforms that offer stronger protections for children."

The trial, which began on February 9, is one of the first in a wave of lawsuits against Meta, as school districts and legislators push for greater restrictions on smartphone use in classrooms.

More than 40 state attorneys general across the United States have filed lawsuits against Meta, claiming it has deliberately designed Instagram and Facebook to be addictive, contributing to a mental health crisis among young people.

In a separate federal case in Los Angeles, hundreds of families and school districts have accused Meta, Snap, TikTok, and YouTube of knowingly designing their platforms to addict young users — leading to depression, eating disorders, self-harm, and other serious mental health challenges.

Snap and TikTok have reached settlements in that case. Meta and YouTube continue to contest it. A jury is currently deliberating a verdict.

All companies deny wrongdoing.