Facebook cannot be reformed. This is why
“We know we have more work to do.”
That was the line from numerous Facebook representatives in reaction to the #StopHateForProfit advertising boycott campaign. Intended to pressure the company to curb hate speech and misinformation, the boycott has been joined by several high-profile brands, including Unilever and Verizon, and could make a rare dent in Facebook’s ad revenue.
The campaign seems to be having an effect. Facebook announced that it would add labels to content about voting and expand its hate speech policies.
The company also added a “newsworthy” tag for hateful content from political figures that violates rules but is allowed because of its news value. Facebook stressed that all these moves were part of a continuing cleanup. “We know we have more work to do,” the statement read.
We Know We Have More Work to Do (let’s call it W.K.W.H.M.W.T.D. for short) is the definitive utterance of the social media era, trotted out by executives whenever their companies come in for a public shaming.
In just eight words, it encapsulates the defensive posture that Facebook has been crouched in ever since the 2016 election, when it became clear that the company's tolerance of hate-filled communities had turned its platforms into witting vectors for disinformation and propaganda.
In Facebook’s case, what is most dangerous about W.K.W.H.M.W.T.D. is that it glosses over the fundamental structural flaws in the platform.
Architecture of the social network
The architecture of the social network — its algorithmic mandate of engagement over all else, the advantage it gives to divisive and emotionally manipulative content — will always produce more objectionable content at a dizzying scale.
Facebook frequently uses its unfathomable amount of content as an excuse for inaction. “We’ve made huge strides,” a Facebook spokesman, Nick Clegg, said on CNN. “But, you know, on an average day, there are 115 billion, 115 billion messages sent on our services around the world, and the vast, vast, vast majority of that is positive.”
But Clegg’s defence is also an admission: Facebook is too big to govern responsibly. There will always be more work to do because Facebook’s design will always produce more hate than anyone could monitor. How do you reform that? You can’t.
Lately, my thoughts on Facebook have been influenced by two separate movements: prison abolition and the push to defund police. There are complex policy issues involved, but the central premise of these movements is elegant in its simplicity. The bloated and corrupt institutions that they critique are beyond reform.
To be clear, there is no one-to-one comparison between Facebook and the police or the carceral state. Modern policing has its origins in slave patrols. Facebook’s origins are obviously much different.
Still, the movements provide a helpful lens through which to view Facebook. Despite the exhausting debates around content moderation and constant incremental tweaks to its rules and policies, glaring problems persist. All signs point to a system beyond reform.
Is it really social?
“You see lots of people putting forth a hopeful idea of a new, humane social media platform to rescue us — one that respects privacy or is less algorithmically coercive,” Siva Vaidhyanathan, a professor of media studies at the University of Virginia, told me recently. “But if we’re being honest, what they’re really proposing at that point is not really social media anymore.”
In other words, the architecture is the problem.
Few who know Facebook really believe that Mark Zuckerberg will dismantle his company or relax his grip on the board, placing conversations like this one more in the realm of thought experiment than in reality.
But for those of us who are at the whims of the company’s power, the status quo also seems untenable. Small reforms are crucial, but they also suggest that the current iteration can be saved — that there’s more work to do. Facebook cannot be reformed.
The #StopHateForProfit campaign is one such reform effort, but there are others. Vaidhyanathan told me that he is thinking less about policing Facebook's platforms; he is trying to imagine ways to help us live in a world dominated by Facebook.
“We probably have to start thinking more radically about what kind of information ecosystem we need to survive as a democratic republic,” he said. His ideas include what he described as “boring” but essential things like investing in libraries and state schools.
There are other ideas, like declaring “platform bankruptcy.” This would involve platforms resetting all of their user and group follower counts to zero and rebuilding communities from the ground up, with the platforms’ current rules in place.
There’s no shortage of ideas on this subject, as my Opinion colleague Annalee Newitz wrote last year: “We need to stop handing off responsibility for maintaining public space to corporations and algorithms — and give it back to human beings.”
I put the question to my Twitter followers, asking for their best ideas to fix tech platforms, and received over 1,000 responses in a few hours.
Some were simple: “Design distribution around a different principle than virality.” Others were wonkish: “Cross-company/platform data and research collaborations between trust and safety teams.”
Many were about fundamental transformation: “Ban algorithmic amplification; require proof of safety, efficacy, freedom from bias before product intro; classify personal data as a human right, not an asset.”
There were calls to get rid of metrics, for strict verification of real identities and for the companies to slow the speed of information. There were privacy solutions and ideas for more tailored community networks.
Many were more blunt: Just shut it all down and start over.
Some of these ideas feel almost too utopian to type, too simple, too improbable. But there's elegance in simplicity; these are visions of an internet we actually want to live on.
Facebook sold us a utopian vision of a more connected world and left us with our current dystopia. Why can’t those of us who are left to clean up its mess have our own shot at utopia? Either way, we know we have more work to do.
— Charlie Warzel is an opinion writer who specialises in technology
New York Times