New York
Facebook Chief Executive Officer Mark Zuckerberg has had a very bad week, even in the context of a very bad year.
The week of bad news actually started March 8 with a proposal from US senator and presidential candidate Elizabeth Warren to break up the company. Then there was the longest-ever outage of Facebook’s social network and services, which almost overshadowed news of a criminal investigation into its data agreements with other companies. Facebook’s technical glitch was resolved just in time for it to post the departure of two key executives, including the one closely linked with the company’s most iconic product. But the ultimate blow came on Friday with the massacre of 50 people in New Zealand, streamed live on Facebook.
“Hedge funds who were previously complacent about the recent negative headlines are raising eyebrows on the news overnight,” Lynx Equity Strategies analyst Jahanara Nissar wrote in a note. The departure of two top executives also was “concerning — especially given that the conflict was over strategy.”
The snowball of bad news is catching up with the company. The shares had their worst day in more than two months Friday, falling 2.5 per cent to close at $165.98.
The weekend didn’t offer much respite. Facebook said it managed to prevent 1.2 million uploads of the massacre video in the first 24 hours, but 300,000 versions made it to the platform before being removed. New Zealand Prime Minister Jacinda Ardern said she wants talks with Facebook on the issue of live streaming. AirAsia Group Bhd Chief Executive Officer Tony Fernandes, one of Asia’s best-known executives, said Sunday he shut his account with 670,000 followers because of the “amount of hate” on social media.
Acting White House Chief of Staff Mick Mulvaney said on CBS’s “Face the Nation” that the question is “how do you stop these crazy people” who are “willing to go on live TV and stream the murder of people.” He said “Donald Trump is no more to blame for what happened in New Zealand than Mark Zuckerberg.”
It might be time to force social-media providers to delay live broadcasting or streaming, Tom Bossert, former homeland security adviser for President Donald Trump, said on ABC’s “This Week” on Sunday.
“It’ll require some time and money, but I think it’s something that we should consider,” Bossert said.
Jeh Johnson, former Homeland Security secretary under President Barack Obama, said on ABC that such a delay could be implemented, but that the government must be careful about regulating speech. Social media and internet service providers must be more vigilant about self-regulating hate speech content that violates their terms of service, he said.
Facing scrutiny
Negative sentiment toward Facebook, as measured in tweets on Twitter, rose on Thursday to its highest level in almost eight months. While sentiment can rise and fall with the thousands of daily tweets about the company, Facebook hadn’t drawn that many negative comments since late July, the day after disappointing revenue and user-growth figures prompted the stock’s biggest-ever sell-off.
Warren’s proposal reflects a new, troubling paradigm for Facebook: in growing numbers, consumers, lawmakers and investors are asking whether the company founded in Zuckerberg’s Harvard University dorm room in 2004 is doing more harm than good. Complaints are growing louder that Facebook has done a poor job of safeguarding data and protecting users from the spread of hate speech, disinformation and live footage of violent events. Maybe, pundits were wondering aloud, it’s time for regulators and politicians to step in.
And that was before the tragedy in Christchurch, New Zealand, which alone would have been enough to prompt soul searching in any CEO. For Zuckerberg, it couldn’t have come at a worse time.
Days after Warren unveiled her break-up plan, Facebook pulled her ads on the platform for the proposal. It wasn’t a good look for Zuckerberg’s defence of the social network as a place for public debate and people of all views. Facebook said Warren’s ads violated company policy against the use of the corporate logo, but “in the interest of allowing robust debate, we are restoring the ads.”
Double crisis
By Wednesday, Facebook was facing two new crises. Beginning about 11:15am New York time, Facebook’s apps and sites, from the news feed to Instagram and WhatsApp, started going down around the world. The problems people experienced varied, from slow load times for pages to seeing no content at all or trouble sending messages. The outage continued into Thursday afternoon, the longest time Facebook’s properties have been recorded as offline since 2012. Facebook said the problem was a result of a shift in the set-up of its computer servers. “We are very sorry for the inconvenience and we appreciate everyone’s patience,” the company said.
Just as concern over the outage was reaching its peak, the company was beset by news that an investigation by the US Justice Department was broadening to include a federal grand jury in New York, a person familiar with the matter told Bloomberg News. The grand jury has subpoenaed records from at least two makers of smartphones and other electronic devices that had partnerships with Facebook, the New York Times reported, citing unidentified people familiar with the requests.
“As we’ve said before, we are cooperating with investigators,” Facebook said.
The company is facing ongoing probes around the world into alleged privacy violations revealed last year stemming from its relationship with Cambridge Analytica, a political consultancy that obtained the data of millions of the site’s users without their consent. The US Federal Trade Commission said last month it was creating a task force to look into possible anticompetitive conduct by Facebook, Alphabet Inc’s Google and other technology companies. Several state attorneys general are also probing Facebook’s privacy practices.
“We’ve provided public testimony, answered questions, and pledged that we will continue to do so,” Facebook said.
Partly in reaction to the pressure Facebook has been under to change how it handles user privacy, rein in fake news and monitor offensive or violent content, Zuckerberg recently announced a pivot in product development to focus on private, ephemeral and encrypted communication. It was a striking change for a company that built its business on open sharing, and it likely prompted the departure of one of Facebook’s top executives.
Chris Cox, who had worked at the company for 13 years, announced he was leaving in a Facebook post on Thursday. Cox had helped invent and develop the news feed, the main channel for personalised life updates for more than 2 billion people — essentially the algorithm-based editor-in-chief of users’ digital lives.
Cox’s departing post alluded to a different view than Zuckerberg’s about Facebook’s future. “We are turning a new page in our product direction,” Cox said. “This will be a big project and we will need leaders who are excited to see the new direction through.” Chris Daniels, who ran WhatsApp, is also leaving Facebook, the company said.
But none of those issues will test Zuckerberg quite like the tragedy in New Zealand.
The slaughter in two mosques came after someone appearing to be the gunman posted links to a lengthy racist manifesto on the site and a forum known for extremist views. A first-person video of the carnage, recorded as he wreaked it, spread across the internet almost immediately. Facebook said it “quickly removed both the shooter’s Facebook and Instagram accounts,” and was taking down any mentions of praise or support for the shooting. But this was exactly the kind of event Zuckerberg has pledged to work harder to avoid. The company has hired thousands of people to manually screen offensive and dangerous content and ploughed money into technology like artificial intelligence to more efficiently analyse and filter live video content. And yet.
Zuckerberg, 34, has acknowledged the difficulty of policing content from the 2.7 billion users that power Facebook’s wildly profitable advertising engine. The company’s business model depends on showing people posts they’re most apt to have an emotional reaction to, which often has the side effect of amplifying fake news and extremism.
Mary Anne Franks, a professor of law at the University of Miami, said there is “simply no responsible way to moderate a true live streaming service.” Facebook has always known the service has the potential to “encourage and amplify the worst of humanity, and it must confront the fact that it has blood on its hands,” she said.