Mark Zuckerberg gestures while addressing the audience during a meeting of the APEC (Asia-Pacific Economic Cooperation) CEO Summit in Lima, Peru, November 19, 2016. Image Credit: REUTERS/Mariana Bazo/File Photo

We finally got a grudging mea culpa from Mark Zuckerberg: an admission that fake news is a significant problem that his social network must help solve.

But as a journalist who has been covering the inner workings of the technology industry for more than a decade, I find the calls for Facebook to accept broad responsibility for fact-checking the news, including by hiring editors and reporters, deeply unsettling.

What those demanding that Facebook accept “responsibility” for becoming the dominant news aggregator of our time seem to be overlooking is that there’s a big difference between the editorial power that individual news organisations wield and the editorial power Facebook could wield.

Such editorial power in Facebook’s hands would be unprecedented and dangerous.

We can all agree that Facebook should do much more to make sure that blatantly fabricated claims that Donald J. Trump won the popular vote or received the pope’s endorsement don’t spread and are, at a minimum, labelled fakes.

Facebook admits, and my sources confirm, that it can do a better job of this by helping users flag dubious articles and by predicting fakes based on the data it already collects for search. This doesn’t have to involve humans. Facebook could label content as suspected fake if it was flagged a certain number of times and displayed other questionable attributes. Such a move would not mean Facebook was taking broad responsibility for what’s true.

But hiring editors to enforce accuracy — or even promising to enforce accuracy by partnering with third parties — would create the perception that Facebook is policing the “truth,” and that is worrisome for two reasons. The first has to do with the nature of Facebook’s business; the second with the news business.

One thing is clear to anyone who has worked in a newsroom: Not all fact-checking decisions are black and white.

Did the pope endorse Trump? He did not.

But did the FBI reopen the Hillary Clinton email investigation? Well, that’s a little tougher. Although major news outlets like CNN said it had, the agency didn’t reopen the inquiry; it merely took a look at newly discovered emails to see whether it should reconsider its decision to close the case, a far less significant step. Erroneous reporting by established organisations is a bigger threat than fabricated stories, and far more rampant.

News organisations like my own publication make these judgments a million times a day. And we sometimes get them wrong. But we are checked by the power of our competitors and, for news organisations with a subscription business, by readers who stop paying us if we fail them.

To be sure, this business model is under great stress as people lose trust in news organisations. But I don’t believe the solution is to give up on it, particularly if the alternative is to cede the power of authentication to companies like Facebook.

I’m not comfortable trusting the truth to one gatekeeper that has a mission and a fiduciary duty to increase advertising revenue, especially when revenue is tied more to engagement than information. Facebook continues to consider, for example, how it can win approval to enter the Chinese market, including by censoring content. For the company, business can come before truth.

No matter how many editors Facebook hired, it would be unable to monitor the volume of information that flows through its site, and it would be similarly impossible for readers to verify what had been checked. The minute Facebook accepts responsibility for ferreting out misinformation, users will start believing that it is fact-checking everything on the site.

And what about more private content in groups or messages? For that to be fact-checked, Facebook users would have to trade their privacy (as an analogy, imagine AT&T fact-checking phone calls). That isn’t a position I think Facebook would ever want to be in.

The second reason I am fearful of Facebook as fact checker is what it will do to journalism.

If you don’t believe that Facebook’s policies could sway the news industry, you haven’t been paying attention over the past five years. Publications have been suckered into tweaking their content and their business models to try to live off the traffic Facebook sends them. They’ve favoured Facebook clicks over their core readers, and are no closer to addressing plummeting print revenues. What would happen if the distribution of their articles on Facebook was tied to submitting data about their sources or conforming to some site-endorsed standards about what constitutes a trustworthy news source?

My fellow reporters and editors will argue that I am letting Facebook off too easy. While my husband did work there for a brief period, my position isn’t a defence of the company, which I have covered critically for years. I simply don’t trust Facebook, or any one company, with the responsibility for determining what is true.

— New York Times News Service

Jessica Lessin is the founder and chief executive of The Information.