
Dubai: Facebook Inc. said it removed far more content from the Daesh militant group in the first quarter of 2018 because it actively sought it out.

The company has trained its review systems - both human moderators and computer algorithms - to seek out posts from terrorist groups. The social network took action on 1.9 million pieces of content from those groups in the first three months of the year, about twice as many as in the previous quarter. And 99 percent of that content wasn't reported first by users but was flagged by the company's internal systems, Facebook said Monday.

Immense power

Facebook, like Twitter Inc. and Google's YouTube, has historically put the onus on its users to flag content for its moderators to review. After pressure from governments to recognize its immense power over the spread of terrorist propaganda, Facebook began taking more direct responsibility about a year ago. Chief Executive Officer Mark Zuckerberg told Congress earlier this month that Facebook now believes it has a responsibility for the content on its site.

The company defines terrorists as non-governmental organizations that engage in premeditated acts of violence against people or property to intimidate and achieve a political, religious or ideological aim. That definition includes religious extremists, white supremacists and militant environmental groups. "It's about whether they use violence to pursue those goals," the company said.

No numbers

The policy doesn't apply to governments, Facebook said, because "nation-states may legitimately use violence under certain circumstances."

Facebook didn't give any numbers for its takedowns of content from white supremacists or other groups it considers linked to terrorism, in part because its systems' training has so far focused on Daesh and Al Qaeda.

Too passive

Facebook has come under fire for being too passive about extremist content, especially in countries such as Myanmar and Sri Lanka, where the company's algorithm, by boosting popular posts, has helped give rise to conspiracy theories that spark ethnic violence. People in those countries told the New York Times that even after they report content, Facebook may not take it down.