Self-driving cars have claimed their first pedestrian fatality, a woman in Tempe, Arizona, who was struck and killed by an Uber vehicle travelling in autonomous mode. Weren’t self-driving cars supposed to be safer than those piloted by fallible humans?
And who says they aren’t? As many on social media rushed to point out, more than 37,000 people were killed by human-piloted vehicles in 2016. Compared with that, one pedestrian fatality, however sad, looks OK.
This argument is appealing. Unfortunately, it’s wrong.
Motor vehicle fatality rates are measured in terms of “vehicle miles travelled”, which is just what it sounds like. In 2016, there were 1.18 fatalities for every 100 million miles that Americans drove. Since Americans drove nearly 3.2 trillion miles that year, that still added up to tens of thousands of deaths.
To know whether self-driving cars are safer than the traditional kind, you’d have to know how many miles they travelled before incurring this first fatality. And the answer is “fewer than 100 million” — a lot fewer. Waymo, the industry leader, recently reported logging its four millionth mile of road travel, with much of that in Western states that offer unusually favourable driving conditions. Uber just reached two million miles with its autonomous programme. Other companies are working on fully autonomous systems, but adding them all together couldn’t get us anywhere close to 100 million. (The numbers go up if you add Tesla’s autopilot, but that system has more limited capabilities, and fatality statistics don’t necessarily get any clearer — or more favourable — if you do.)
One fatality at these numbers of road-miles driven does not suggest, to put it mildly, a safety improvement over humans. It’s more like a dramatic step backward, or, if you like, a high-speed reverse.
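The back-of-envelope comparison behind that claim can be made explicit. A minimal sketch, assuming the roughly four million Waymo miles and two million Uber miles cited above are taken as the combined autonomous total (a simplifying assumption, since other fleets add some miles):

```python
# Back-of-envelope fatality-rate comparison, using the figures cited in the column.
HUMAN_RATE = 1.18          # human-driver fatalities per 100 million miles (2016)
AUTONOMOUS_MILES = 6e6     # assumed total: ~4M (Waymo) + ~2M (Uber)
FATALITIES = 1             # the Tempe pedestrian death

# Implied fatalities per 100 million miles for the autonomous fleet so far
autonomous_rate = FATALITIES / (AUTONOMOUS_MILES / 1e8)

print(f"human drivers:    {HUMAN_RATE:.2f} deaths per 100M miles")
print(f"autonomous fleet: {autonomous_rate:.2f} deaths per 100M miles")
print(f"ratio: roughly {autonomous_rate / HUMAN_RATE:.0f}x")
```

On these assumed numbers the implied autonomous rate is an order of magnitude above the human one — though, with a sample of exactly one fatality, the estimate is extremely noisy, which is precisely the column’s later point about needing billions more miles.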
Which is not to say that we should pull the plug on autonomous driving. For one thing, regular old-fashioned cars were none too safe when they first arrived on American roads, and they got steadily better. Think of all the elderly forced to leave their homes when they can no longer drive, the people whose disabilities make it hard or impossible for them to operate a car, the bar-room drunks who get behind the wheel because there’s no easy way to get home. Autonomous vehicles promise to solve these major problems; given those benefits, it’s worth pursuing that promise even if these cars are currently less safe than human drivers, at the temporary cost of some road safety now.
Especially since it’s not clear that there is actually a cost in road safety. The glib insistence that self-driving cars are safer than human drivers is not well-founded — but neither is a counter-reaction that insists that they’re obviously much more dangerous.
We won’t know how dangerous self-driving cars are compared with human drivers until they’ve driven billions more miles. At the moment, we just know that they can kill people, not how often they will. And that’s a possibility that advocates for self-driving cars should have prepared the public for better than they have.
Enthusiasts for autonomous vehicles have been a little too quick to respond to safety concerns by pointing out how many accidents human drivers have — or by noting that self-driving cars don’t text, or drive drunk, or fall asleep at the wheel. All true, but as they drive more and more miles, we may discover that they have problems humans don’t.
Uber has already pulled its autonomous cars off the road in response to this tragedy. If it hadn’t, it seems likely that the public pressure to do so would have been deafening. The company — and advocates of autonomous vehicles more generally — will need to put in a lot of work over the next few months restoring public faith in this technology.
Luckily, software systems can be reengineered; even if self-driving cars aren’t currently safer than a human driver, there’s good reason to expect that one day they will be. But we’ve done that future no service by talking as if it had definitely already arrived. If we tell people that self-driving cars are perfectly safe, and that turns out not to be true, the backlash could drive these vehicles off the road before they’re able to deliver on their promise.
— Washington Post
Megan McArdle is a columnist. She writes for Bloomberg, Newsweek, the Atlantic and the Economist.