Recently the US Senate Commerce Committee grilled the CEOs of Facebook, Twitter, and Google. With the GOP on the hunt for partisan bias and the Democrats urging greater efforts to reduce misinformation, both sides ignored some fundamental principles of democracy.
The ostensible purpose of the hearing was to resume the argument over whether to amend Section 230 of the Communications Decency Act. In truth, Republicans called the tech CEOs to press them on their handling of a controversial New York Post story that alleges wrongdoing by Hunter Biden. Democratic senators responded that the GOP was trying to “bully” the techies.
Well, goodness.
Let’s start with a reminder that the social media companies are private enterprises, and they’re clothed with a First Amendment right to curate content on their sites as they like. Yes, absolutely, one might sometimes wish that they acted in a more principled and even-handed manner, but did I happen to mention that they’re private enterprises?
It’s true that misinformation is rampant online. One is reminded of what Isaac Asimov called Gennerat’s Law: “The falsely dramatic drives out the truly dull.” There’s a lot of the falsely dramatic floating around out there, and people tend to gravitate toward the bits that make the other side look worse.
Nevertheless, the tech giants, by passing judgment on what’s too unreliable to be seen, are taking tentative steps down a road that’s rarely led anywhere good. Even private restriction, although not matching any of the classic definitions of censorship, betrays a kind of hubris: what John Stuart Mill famously derided as a belief in one’s own infallibility. Worse, what tends to motivate the removal of bad information is a fear of the danger posed by whatever is being omitted or suppressed, a worry about what might happen should the wrong people wind up seeing it.
The deep problem here isn’t that the companies often act as though they’re wearing partisan blinders. The problem is that even were the work done with perfect political neutrality, the determination to avoid the use of a platform to spread “misinformation” would still display the same basic attitude. When a platform spots a piece it considers suspect and its staff or review partners say, “Nope, can’t let people see this,” the unspoken message is, “We here at Twinstabook are clever enough to understand what’s really going on. The people who rely on our platform aren’t.”
From climate change to Covid-19
On issues from climate change to Covid-19, the social media companies often take the view that there are arguments too dangerous to allow their users to see. I agree that climate change poses a dangerous threat and that bad advice about the novel coronavirus could lead to a deadlier spread. But it’s an enormous leap from holding a position, even passionately, to believing that others shouldn’t be treated as wise enough to make up their own minds.
Yes, the public square is awash in misinformation. It has been ever thus. I’m of the generation trained to believe that the cure for bad information is good information. If people are sometimes persuaded by the false, that’s a risk attendant upon the proper practice of democracy.
Nowadays, when we say “democracy” we almost always think of voting. But I cling to a classical vision in which voting is only one piece of what makes democracy valuable. More vital is acknowledging our joint participation, together with coequals, in a common enterprise of self-governance; an enterprise in which we respect, among other things, the ability of our fellow citizens to decide for themselves which argument to accept. When a point of view is suppressed because those who hold the power to shape dialogue consider it wrong, even dangerously wrong, we’re engaged in the opposite of democracy. Censorship deprives individuals of the ethical right to decide for themselves what to believe. The fact that a private company has the unquestioned freedom to violate that ethical right doesn’t mean that it should.
Stephen L. Carter is a columnist and writer. He is a professor of law at Yale University.
Bloomberg