Facebook’s chief executive, Mark Zuckerberg, has announced the company’s “New Steps to Protect the US Elections.”
They include blocking new political ads in the week leading up to Election Day and attaching labels to posts containing misinformation, specifically related to the coronavirus and posts from politicians declaring victory before all the results are counted.
One can — and many will — debate just how effective these measures will be at preventing election night chaos during a pandemic. (So far Facebook’s “misleading post” labels are vague to the point of causing additional confusion for voters.
Similarly, blocking new political ads one week out from the vote ignores the vast amounts of disinformation Americans are subjected to year after year.) But what seems beyond debate is just how deeply Facebook has woven itself into the fabric of democracy.
Reading Zuckerberg’s election security blog post reminded me of a line from a seminal 2017 article by the journalist Max Read. Three years ago, Read was struck by a similar pledge from Zuckerberg to “ensure the integrity” of the German elections.
Facebook’s immense power
The commitment was admirable, he wrote, but also a tacit admission of Facebook’s immense power. “It’s a declaration that Facebook is assuming a level of power at once of the state and beyond it, as a sovereign, self-regulating, suprastate entity within which states themselves operate.”
That power is consolidated in the decisions of its chief executive, who has voting control over the company. Here’s how Facebook’s co-founder Chris Hughes described Zuckerberg’s iron grip on the company last year: “Mark’s influence is staggering, far beyond that of anyone else in the private sector or in government.
“He controls three core communications platforms — Facebook, Instagram and WhatsApp — that billions of people use every day. Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 per cent of voting shares.
“Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.”
Consolidation of power
If Hughes’s description feels hyperbolic, it may be because such a consolidation of power is actually hard to comprehend.
“I think we underestimate Facebook’s power constantly,” Siva Vaidhyanathan, a professor of media studies at the University of Virginia, told me. “It’s really hard for human beings to picture in their head the actual size and influence of the platform.
Something like one out of three people use the thing — it’s like nothing we’ve encountered in human history. And I’m not sure Mark Zuckerberg is even willing to contemplate his influence. I’m not sure he’d ever sleep if he ever thought about how much power he has.”
Facebook’s power is now self-perpetuating. This week provided a great example. On Tuesday, Facebook and other platforms revealed a covert operation run by the Kremlin-backed Internet Research Agency to sow division ahead of the presidential election by setting up a network of fake user accounts and websites.
This time, though, the agency hired unwitting American freelance journalists to create the content.
There’s a grim circle-of-life quality to this news. Facebook’s unprecedented growth and commandeering of the digital advertising market — alongside Google and others — helped accelerate the collapse of journalism’s broken business models. This led to consolidation, publications shuttering and layoffs of journalists everywhere.
Facebook’s news dominance and mercurial distribution algorithms led to a rise of hyperpartisan pages and websites to fill the gaps and capitalise on the platform’s ability to monetise engagement, which in turn led to a glut of viral misinformation and disinformation that Facebook has been unable (or perhaps unwilling) to adequately police.
This free-for-all has made Facebook the platform of choice for political manipulation. Those bad actors are now hiring and exploiting the very freelance journalists displaced by the collapse of the media industry that Facebook helped accelerate.
Eventually, Facebook takes action to remove the bad actors, assuring the country of its commitment to democracy and cementing its role as a protector of free and fair elections.
Facebook wins in every direction. Its size and power create instability, the answer to which, according to Facebook, is to give the company additional authority.
This cycle is unsustainable. This summer has shown that the platform has been a prime vector for the most destabilising forces in American life. It has helped supercharge conspiracies around the dangerous QAnon movement.
It has provided organisation for, and amplified calls to action from, militia movements, which have been linked to deaths in US cities at protests.
Its moderation policies have failed to catch blatant rule violations around voter disenfranchisement, and the conspiracy theories that go viral on the platform have found their way, time and again, to President Trump’s mouth.
Facebook employees seem to understand the situation is untenable and are speaking out internally against Zuckerberg’s leadership.
“He seems truly incapable of taking personal responsibility for decisions and actions at Facebook,” one Facebook employee told BuzzFeed News after a company meeting in response to the violence in Kenosha, Wis.
With just two months to go before the election, the nation’s focus is on the integrity of the electoral process. With the president threatening to undermine the results of the election, the stakes could not be higher. As Zuckerberg wrote on Thursday, “We all have a responsibility to protect our democracy.”
But what does it say that one of those institutions charged with protecting democracy is, itself, structured more like a dictatorship?
“Facebook had grown too big, and its users too complacent,” Read concluded at the end of his 2017 piece. His words feel prescient today as Facebook, unchecked and unregulated by governments, positions itself as a primary line of defence to protect those institutions.
At first, Zuckerberg’s recent election pledge might feel comforting. But his plan is an admission of a great power that should make us uncomfortable.
In our quest to fend off a would-be strongman’s power grab in one realm, we ought not allow a stronger man’s power grab in another.
Charlie Warzel is a senior writer and tech columnist for The New York Times