
As the CEO of YouTube, I’ve seen how our open platform has been a force for creativity, learning and access to information. I’ve seen how activists have used it to advocate social change; how it serves as both an entertainment destination and a video library. How it has expanded economic opportunity and helped enlighten my children, giving them a bigger, broader understanding of our world and the billions who inhabit it. But I’ve also seen that there can be another, more troubling, side to YouTube’s openness. Some are exploiting it to mislead, manipulate, harass or even harm.

In the last year, we took action to protect our community against violent or extremist content. We tightened our policies on what content can appear on the platform or earn revenue for creators. We expanded our enforcement teams. And we invested in new machine-learning technology to scale the efforts of our human moderators to take down content that violates our policies. Now we are applying the lessons we’ve learned to tackle other problematic content. Our goal is to stay one step ahead, making it harder for policy-violating content to surface or remain on YouTube.

Human reviewers remain essential to both removing content and training machine learning systems, because human judgement is critical to making contextualised decisions. Since June, our teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine learning technology to identify similar videos. We are also taking aggressive action on comments. In the last few weeks we’ve used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments. Our teams work closely with child-safety organisations around the world to report predatory behaviour and accounts to the correct law-enforcement agencies. We will continue the significant growth of our teams, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.

We will apply our cutting-edge machine learning more widely to quickly remove content that violates our guidelines. Since June we have removed over 150,000 videos for violent extremism. Machine learning is helping human reviewers remove nearly five times as many videos as before. Today, 98 per cent of the videos we remove for violent extremism are flagged by our machine-learning algorithms, with nearly 70 per cent of violent extremist content taken down within eight hours of upload and nearly half of it within two hours. Since June, machine-learning technology has reviewed and flagged content that would have taken 180,000 people, each working 40 hours a week over that period, to assess.

Because we have seen these positive results, we have begun training machine-learning technology across other challenging content areas, including child safety and hate speech. We understand that people want a clearer view of how we’re tackling problematic content. Our community guidelines give users notice of what we do not allow, and we want to share more information about how those guidelines are enforced. That’s why in 2018 we will publish a regular report providing more aggregate data about the flags we receive and the actions we take to remove content that violates our policies.

We’re also taking action to protect advertisers and creators from inappropriate content. We want advertisers to have peace of mind that their ads run alongside content that reflects their brand’s values. Equally, we want to give creators confidence that their revenue won’t be hurt by the actions of bad actors. We believe this requires a new approach to advertising on YouTube, one that carefully considers which channels and videos are eligible for ads. We plan to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads run only where they should. This will also give vetted creators more stability in their revenue.

We are taking these actions because it’s the right thing to do. Creators make incredible content that builds global fan bases. Fans come to YouTube to watch, share and engage with this content. Advertisers, who want to reach those people, fund this creator economy. Each group is essential to our creative ecosystem, and all three deserve our best efforts. As challenges to our platform evolve and change, our enforcement methods must evolve to respond to them. But no matter what challenges emerge, our commitment to combat them will be unwavering. We will take the steps necessary to ensure that YouTube continues to be a place where creators, advertisers, and viewers can thrive.

— The Telegraph Group Limited, London, 2017

Susan Wojcicki is the CEO of YouTube. Twitter: @SusanWojcicki