Mark Zuckerberg. Photo: Gulf News Archive

There is an argument that the latest Facebook scandal is a lot of fuss about nothing. A week-long psychological experiment on 690,000 users in 2012, which did no damage and had a barely noticeable effect, hardly registers on the scale of research abuses over the years.

Nor does hiding a few positive and negative posts from a small fraction of Facebook’s 1.3 billion users in order to monitor “emotional contagion” compare with the harm to which other companies sometimes expose customers. Drivers of faulty cars and consumers of processed food stuffed with fat and salt take far bigger risks with their health. Facebook has made no secret of the fact that its news feed is a manipulated version of reality. It gives prominence to the posts and links that testing has shown are most likely to interest users and to encourage them to return and post themselves. These tests are not sinister experiments; they are product development.

All true, but there is one big difference: We are the product that Facebook has been testing. Perhaps we should grow up and accept that this is how the world works when we use an advertising-funded social network stuffed with details of our lives and those of family and friends. But if we find it creepy, that is what the experiment proves. With this blunder, Facebook’s data scientists have drawn attention to two things of which most of us may have been vaguely aware, but have not pondered very hard.

First, Facebook does not feel the need to ask for permission before carrying out its tests. Its terms of use, the screed of text that most of us scroll through rapidly to click “yes” at the end and move on, include a reference to using data to “improve” the product. Since 2012, the word “research” has also been included. That is it. Its research paper, published in the Proceedings of the National Academy of Sciences of the United States of America, claims that it had the “informed consent” of Facebook users, which is blatantly false. Making all users agree to a catch-all list that does not provide a clear, specific description of a study fails the 1979 consent test for US academic research.

Edward Felten, a professor of computer science and public affairs at Princeton University, describes its terms of use as “a legal fiction of consent” in academic terms. Even commercially, Facebook is in a privileged position. Many companies such as Unilever and Procter & Gamble carry out psychological studies, but they recruit subjects: You are not made to sign a research waiver to buy an ice-cream.

Second, Facebook wields incredible power over the behaviour of its users. This is partly the result of its sheer size. Another research study by Facebook’s scientists, on how information is spread by networks of friends, notes that “our sample consists of approximately 253 million” users. In other words, they experimented on the equivalent of four times the population of France.

As well as being big, Facebook holds more intimate information about its users than other internet companies do. The algorithm that controls the news feed is similar to the one that powers Google’s search rankings: Both rank material by how relevant it is, drawing on an array of data, and Google’s display is likewise influenced by users’ interests and responses.

Google, however, analyses material from across the web, whereas Facebook focuses its judgements on personal material. An algorithm that selects from thousands of links about, say, Buckingham Palace feels like a service; one that weeds out the posts of friends and family feels like a moral guardian.

Facebook has demonstrated that it can alter behaviour; that is what its experiments are about. “We regularly run tests to work out how to make the experience better. Through testing, we have found that when people see more text status updates on Facebook, they write more status updates themselves,” a manager wrote in January.

The emotional contagion paper concludes that “given the massive scale of social networks such as Facebook, small effects can have large aggregated consequences”. The experiment with altering users’ news feeds had only a tiny impact on their behaviour, but even that “would have corresponded to hundreds of thousands of emotion expressions in status updates per day”.

Occasionally, it uses its powers for a specific purpose, as in 2012 when founder Mark Zuckerberg nudged Facebook users towards becoming organ donors by allowing existing donors to display that status and encourage friends to follow suit (the initiative had a significant take-up). Mostly, though, it tinkers with the algorithm for the sole purpose of stimulating user growth and activity.

Zuckerberg and other Facebook executives share the Panglossian philosophy that what is good for its users — sharing material with their friends — is good for Facebook, and whatever promotes it is beneficial for everyone. “The goal of news feed is to deliver the right content to the right people at the right time” is the service’s oft-repeated, deceptively simple nostrum. This accounts for the innocently surprised tone of the apology offered last week by Adam Kramer, the Facebook data scientist who designed the controversial emotional contagion study. “The goal of all of our research at Facebook is to learn how to provide a better service ... I can tell you that our goal was never to upset anyone,” he wrote.

Naturally, it was not. But the US academic research guidelines were drawn up long ago precisely to stop people who believed they had the greater good in mind from behaving as they thought best. It is time for Facebook to read them.

— Financial Times