People pose with mobile devices in front of a screen projected with a YouTube logo. Image Credit: Reuters

First, YouTube said a streaming star’s repeated homophobic remarks about a Vox writer did not violate the platform’s rules. Then, after a lot of furious people sent a lot of furious tweets prompting a lot of scorn-filled news stories, YouTube determined the remarks did violate its rules, and announced it would strip advertising from the offending videos.

But wait — there was more. The writer, Carlos Maza, protested that the streamer, Steven Crowder, still sold T-shirts declaring that “Socialism Is for ***.” (A literal fig leaf took the place of the asterisks.) So YouTube said Crowder would have to quit his hawking. Observers were baffled that the bigotry apparently wouldn’t be an issue if it weren’t branded on Hanes cotton, so YouTube tried again: It wasn’t just the T-shirts Crowder had to get rid of to stop being a rule-breaker. It was all that other awful stuff, too.

Finally, at the end of this daylong debacle, YouTube published a blog post explaining its thought process. The platform’s promise? It would update its rules, of course.

Rules, though, are not worth much unless they serve a principled purpose. And figuring out that purpose could be the most crucial and most complicated challenge for platforms today.

YouTube has policies prohibiting hate speech and harassment, at least in theory. Had YouTube wanted to act against Crowder’s conduct, it could have — citing the rules. And when it didn’t want to act against that conduct, it didn’t — again, citing the rules. This was reminiscent of Infowars founder Alex Jones having been within the rules as he befouled the platforms with hoaxing and hate, until suddenly last summer he found himself banned.

Thorny questions

The primary principle guiding YouTube in these cases appears to have been the imperative to avoid making very many people very angry very publicly. Rules were only, well, a fig leaf.

But really, platforms should write and apply their rules only after facing up to some thorny questions. What’s their responsibility for what happens in the world off the Web? It’s easy to say you’re for free speech. And it’s easy to say you value the safety of your community members. But what do you do when those principles run up against each other? For YouTube and other platforms, these questions are existential. Sometimes the companies offer stages for debate, sometimes information services, sometimes just the easiest way to check up on Grandma. What do they really believe they are?

It’s a lot to grapple with. There’s a risk platforms will become censor-happy and wipe away the openness and empowerment they were built to provide. There’s a risk they won’t do enough and that people will keep getting hurt.

There are practical problems as well as philosophical ones: These sites serve millions or even billions, and they have to operate at scale. While obvious violations can be left to an algorithm, neither computers nor low-level moderators operating under too much stress with too little pay can referee the edge cases that animate our most fractious Twitter fights. Still, that’s no excuse not to do the grappling.

YouTube could have said from the outset last week that its commitment to providing a forum for public figures to argue with each other over political topics outweighed its commitment to preventing cruelty or protecting the marginalised, if that’s what it believes. It could have said the opposite. Or it could have said that the doxxing and other attacks Maza was suffering from Crowder’s followers made the balancing act moot, because any content that leads to real-world harm is unacceptable.

A foundation of principle

Instead of dispensing a bunch of unsigned tweets full of Byzantine reversals, YouTube also could have told us how it was coming to its decision. Better yet, it could establish a transparent process, like the sort of oversight board Facebook is crafting, untethered to profits or stock price. Platforms need to do the work and show their work — show not only that they have rules but also that those rules are built on a foundation of principle and that the principle is more meaningful than just trying to avoid making people angry on the internet. Which is an impossible goal anyway.

— Washington Post

Molly Roberts is a columnist who specialises in technology and society.