Consider all the ways that governments are proposing to rein in Facebook. The gamut runs from regulatory fines to threats to dismantle the behemoth. Some of these measures are counterproductive. Regulators are trying to address Facebook as if it’s like companies they have encountered before. But Facebook presents radically new challenges. It is unlike anything else in human history — with the possible exception of Google.
Just this month the United Kingdom proposed a “duty of care” standard for platform companies to ensure they filter potentially harmful content. The government of Canada last week declared that Facebook had broken the law by failing to prevent users’ data from flowing to the political consulting firm Cambridge Analytica. Irish regulators started investigating Facebook for failing to protect users’ passwords. The government of Sri Lanka temporarily shut down Facebook and WhatsApp after a terrorist attack on Christians killed more than 250 people. And Facebook told its investors that it expects the US government to issue a fine of up to $5bn for violating a 2011 order that was supposed to prevent the distribution of personal data to the likes of Cambridge Analytica.
Each of these regulatory measures aims to address one negative consequence of Facebook at a time. No one, it seems, is prepared to consider Facebook (and its other global services, WhatsApp and Instagram) in its totality.
It’s as if governments around the world are addressing individual weather systems as they hit and do harm. But no one is considering the dangers of climate change.
That said, Facebook is but one node (albeit the largest and most powerful one) in a matrix of surveillance systems covering every aspect of our lives. In much of the world, the major systems of surveillance are state-based. In the rest of the world, they are commercial. But data flow easily between states and commercial enterprises.
The problems Facebook causes or amplifies — data dumps, privacy violations, the proliferation of hate speech or other nonsense — are not glitches. They are not examples of Facebook failing. They are examples of Facebook working as designed.
Facebook does three things. It collects records of our activities, proclivities, locations, and associations. It uses those data to position advertisements that have proven more effective yet less expensive than in any other medium. And it uses those data to choose for us what we shall see, read, and with whom we should interact through its system. Its algorithms structure our social lives so subtly we hardly notice. It influences what we consider true, important, and valuable in powerful ways we are only now starting to realise.
Overall, Facebook undermines our ability to communicate on our own terms, to deliberate about public issues in a sober and informed fashion, and to build trust among citizens. The macro effect is far more dangerous than any particular breach of user trust or violation of privacy law.
Facebook is such a powerful and pervasive global system that confronting it demands radical new thought. It reaches more than 2.3 billion people, who regularly post videos, photos, and text to Facebook in more than 110 languages. The very idea that Facebook can police itself is absurd.
Beyond scale, Facebook operates across most of the world without serious competition for advertisements or attention. Among the five social media platforms with more than one billion users, four of them (Facebook, Messenger, WhatsApp and Instagram) are owned by Facebook. The fifth, WeChat, operates primarily in China, where Facebook does not. The only platform that competes for attention with Facebook at that scale is YouTube (owned by Google), with about 2 billion viewers. But it performs different functions in our lives.
Here is the trick: Facebook operates globally but works differently locally and regionally. People use Facebook differently in the Philippines, where it is close to the only source of information and method of personal communication for most citizens, than they do in Germany, where it is fairly peripheral to people’s daily lives. Even India and Sri Lanka have very different experiences with Facebook and WhatsApp because they have distinct histories and political systems.
So how should we approach this weird monster? Each country will have to assess how its social, cultural and political health is affected by Facebook. Each will have to approach Facebook as part of an information ecosystem, connected intimately with other systems of expression and media forms like television and news services. Each will have to assess how much power it wants Facebook to have in that ecosystem. Each will have to deploy an array of responses to mitigate the negative consequences of Facebook while recognising its value in people’s lives (the places where it is most valuable to people, such as in Sri Lanka or Myanmar, it also does the most damage).
Governments and citizens will have to consider advertising taxes, financial penalties, restrictions on data collection, restrictions on data use, transparency about how the algorithms work, and restrictions on some content itself (where permitted). Some of the larger powers, such as the United States and the European Union, should consider severing WhatsApp and Instagram from Facebook so there is some semblance of competition.
Most importantly, states should consider their approach to Facebook as a comprehensive program. Facebook has the power and money to absorb any meagre fine any government wants to place on it. After three years of non-stop bad news about the company, its popularity, revenue, and stock price keep rising. Facebook operates beyond the reach of states and market forces. This is not healthy. Addressing Facebook properly will require years of study. It will also require fresh thought and bold creativity.
— Guardian News & Media Ltd
Siva Vaidhyanathan is a professor of Media Studies at the University of Virginia and the author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy