
Facebook, Instagram ‘breeding ground’ for child predators: US lawsuit

Lawsuit filed against Meta to protect children from sexual abuse and human trafficking


San Francisco: Facebook and Instagram have become a "breeding ground" for predators targeting children for human trafficking, grooming and solicitation, a lawsuit filed in the US claims, alleging that certain child exploitative content is over "10 times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans".

New Mexico Attorney General Raul Torrez filed the lawsuit against Meta Platforms and its Founder and CEO Mark Zuckerberg to protect children from sexual abuse and human trafficking.

"Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex," said Attorney General Torrez in a press statement.

Undercover investigation

Over the past few months, the New Mexico Attorney General's Office carried out an undercover investigation of Meta's platforms, creating decoy accounts posing as children aged 14 and younger.

The probe revealed that Facebook and Instagram "proactively served and directed the underage users a stream of egregious, sexually explicit images — even when the child has expressed no interest in this content".


These platforms also enabled "dozens of adults to find, contact, and press children into providing sexually explicit pictures of themselves or participate in pornographic videos".

"Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children," AG Torrez claimed.

"Despite repeated assurances to Congress and the public that they can be trusted to police themselves, it is clear that Meta’s executives continue to prioritise engagement and ad revenue over the safety of the most vulnerable members of our society," he added.

As outlined in the lawsuit, Meta failed to remove Child Sexual Abuse Material (CSAM) from its platforms and enabled adults to find, contact, and solicit underage users to produce illicit pornographic imagery and participate in commercial sex.

Addictive design

The New Mexico Attorney General's complaint also details how Meta harms children and teenagers through the addictive design of its platforms, degrading young users' mental health, sense of self-worth, and physical safety.


The Office’s investigators found that certain child exploitative content is over 10 times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans.

Moreover, while the images and case studies included in the complaint are shocking, the New Mexico Attorney General's Office said it excluded many images its investigators found on Meta's platforms from the complaint because they were deemed too graphic and disturbing.


A Meta spokesperson said in a statement that the company had recently introduced proactive methods for catching and removing accounts and groups that may violate its child safety policies.

"Child exploitation is a horrific crime and online predators are determined criminals. We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators," said the spokesperson.


Meta was also sued in October by 33 US states that alleged it targeted children with addictive features.
