A still image taken from video circulated on social media, apparently taken by a gunman and posted online live as the attack unfolded, shows him retrieving weapons from the boot of his car in Christchurch, New Zealand. Image Credit: Reuters

San Francisco - As a grisly video recorded by the alleged perpetrator of Friday’s bloody massacre at two New Zealand mosques played out on YouTube and other social media, Neal Mohan, thousands of miles away in San Bruno, California, had the sinking realisation that his company was going to be overmatched - again.

Mohan, YouTube’s chief product officer, had assembled his war room - a group of senior executives known internally as “incident commanders” who jump into crises, such as when footage of a suicide or shooting spreads online.

The team worked through the night, trying to identify and remove tens of thousands of videos - many repackaged or recut versions of the original footage that showed the horrific murders. As soon as the group took down one, another would appear, as quickly as one per second in the hours after the shooting, Mohan said in an interview.

As its efforts faltered, the team finally took unprecedented steps - including temporarily disabling several search functions and cutting off human review features to speed the removal of videos flagged by automated systems. Many of the new clips were altered in ways that outsmarted the company’s detection systems.

“This was a tragedy that was almost designed for the purpose of going viral,” Mohan said in an interview with The Washington Post that offered YouTube’s first detailed account of how the crisis unfolded inside the world’s largest video site. “We’ve made progress, but that doesn’t mean we don’t have a lot of work ahead of us, and this incident has shown that, especially in the case of more viral videos like this one, there’s more work to be done.”


The uploads came more rapidly and in far greater volume than during previous mass shootings, Mohan said. Video, mainly from the point of view of victims, spread online from the shootings at a concert in Las Vegas in October 2017 and at a Pittsburgh synagogue this past October. But neither incident included a live-stream recorded by the perpetrator. In New Zealand, the shooter apparently wore a body-mounted camera as he fired into crowds of worshipers.

Each public tragedy that has played out on YouTube has exposed a profound flaw in its design that allows hate or conspiracies to flourish online. Despite being one of the crown jewels of Google’s stable of massively profitable and popular online services, YouTube could not, for many hours, stop the flood of users who uploaded and re-uploaded the footage showing the mass murder of Muslims. About 24 hours later - after round-the-clock toil - company officials felt the problem was increasingly under control, but acknowledged that the broader challenges were far from resolved.

“Every time a tragedy like this happens we learn something new, and in this case it was the unprecedented volume” of videos, Mohan said. “Frankly, I would have liked to get a handle on this earlier.”

The company - which has come under increasing fire for allowing Russian operatives to use its site to interfere in the 2016 election and for being slow to catch inappropriate content - has worked behind the scenes for more than a year to improve its systems for detecting and removing problematic videos. It has hired thousands of human content moderators and has built new software that can direct viewers to more authoritative news sources more quickly in times of crisis. But YouTube’s struggles during and after the New Zealand shooting have brought into sharp relief the limits of the computerised systems and operations that Silicon Valley companies have developed to manage the massive volumes of user-generated content on their sprawling services.

In this case, humans determined to beat the company’s detection tools won the day - to the horror of people watching around the world.

YouTube was not alone in struggling to control the fallout Friday and over the weekend. The rapid online dissemination of videos of the terrorist attack - as well as a 74-page manifesto, apparently written by the shooter, that railed against Muslims and immigrants - seemed shrewdly planned to reach as many people online as possible.