When the Cambridge Analytica scandal broke, many Facebook users already had one foot out the door. Image credit: AFP

I joined Facebook in 2008, and for the most part, I have benefited from being on it. Lately, however, I have wondered whether I should delete my Facebook account.

As a philosopher with a special interest in ethics, I am using “should” in the moral sense. That is, in light of recent events implicating Facebook in objectionable behavior, is there a duty to leave it?

In moral philosophy, it is common to draw a distinction between duties to oneself and duties to others. From a self-regarding perspective, there are numerous reasons one might have a duty to leave Facebook.

For one thing, Facebook can be time-consuming and addictive, to no fruitful end. In addition, as researchers have demonstrated, Facebook use can worsen depression and anxiety.

Addiction

Someone who finds himself mindlessly and compulsively scrolling through Facebook, or who is constantly comparing himself unfavorably with his Facebook friends, might therefore have a duty of self-care to get off Facebook.

From the perspective of one’s duties to others, the possibility of a duty to leave Facebook arises once one recognizes that Facebook has played a significant role in undermining democratic values around the world.

For example, Facebook has been used to spread white supremacist propaganda and anti-Semitic messages in and outside the United States.

The United Nations has blamed Facebook for the dissemination of hate speech against Rohingya Muslims in Myanmar that resulted in their ethnic cleansing.

Facebook also enabled the political data firm Cambridge Analytica to harvest the personal information of millions of voters in the United States so they could be targeted with personalized political advertisements.

Confirmation bias

A significant amount of fake news can be found on Facebook, and for many users, Facebook has become a large echo chamber, where people merely seek out information that reinforces their views.

Some people might think that because they mostly share photos of their cats on Facebook, such concerns do not apply to them. But this is not so, for three reasons.

First, even if one does not contribute directly to the dissemination of fake news or hang out in echo chambers, simply being on Facebook encourages one’s friends to stay on Facebook, and some of those friends might engage in such activities.

This influence on others is an instance of a (positive) network effect, in which a product becomes more valuable as more people use it.

Second, by being on Facebook one serves as a data point for Facebook’s social media experiment, even if one encounters none of Facebook’s experimental manipulations.

In doing so, one could be helping Facebook to refine its algorithms so that it can better single out specific individuals for certain purposes, some of which could be as nefarious as those of Cambridge Analytica.

Consider an analogy.

When testing the safety and efficacy of new drugs, subjects are randomly assigned either to an experimental group or a control group, and only subjects in the experimental group receive the new drug.

Nevertheless, the subjects in the control group are essential to the experiment.

Third, using Facebook is not just an individual action but also a collective one that may be akin to failing to pay taxes.

A few people failing to pay taxes might not make much of a difference to a government’s budget, but such an action may nevertheless be wrong because it is a failure to participate in a collective action that achieves a certain good end.

In a similar vein, choosing to remain on Facebook might not directly undermine democratic values.

But such an action could also be wrong because we might be failing to participate in a collective action (that is, leaving Facebook) that would prevent the deterioration of democracy.

So do we have an obligation to leave Facebook for others’ sake?

The answer is a resounding yes for those who are intentionally spreading hate speech and fake news on Facebook. For those of us who do not engage in such objectionable behavior, it is helpful to consider whether Facebook has crossed certain moral “red lines,” entering the realm of outright wickedness.

For me at least, Facebook would have crossed a moral red line if it had, for example, intentionally sold the data of its users to Cambridge Analytica with the full knowledge that the company would use the data subversively to influence a democratic election.

Likewise, Facebook would have crossed a red line if it had intentionally assisted in the dissemination of hate speech in Myanmar. But the evidence indicates that Facebook did not intend for those things to occur on its platform.

The fact that those things did occur, however, means that Facebook needs to be much more proactive in fixing such problems.

Will it?

The recent revelation that Facebook hired an opposition-research firm that attempted to discredit protesters by claiming that they were agents of the financier George Soros is not encouraging.

Darkness crowding in

While there still appears to be some daylight between Facebook and what is being done on its platform or in its name, darkness is crowding in.

That said, we should not place the responsibility to uphold democratic values entirely on Facebook. As moral agents, we should also hold ourselves responsible for our conduct, and we should be reflective about what we say, react to and share when we are on social media.

Among Twitter users, a common refrain is “retweets are not endorsements.” In a similar manner, one might also think that “sharing” or “reacting to” are not “endorsements.” This is a mistake. By sharing or reacting to a post, even if one explicitly criticizes the post, one is amplifying the message of that post and signaling that the post warrants further attention.

For now, I’m going to stay on Facebook. But if new information suggests that Facebook has crossed a moral red line, we will all have an obligation to opt out.

S. Matthew Liao (@smatthewliao) teaches philosophy and directs the Center for Bioethics at New York University.