Uber doesn’t have the greatest reputation when it comes to self-driving car safety, having been unceremoniously kicked out of San Francisco in 2016 and scaling back its tests in 2017 after a crash in Arizona. In the most recent incident, an Uber test vehicle, in autonomous mode with a safety driver behind the wheel, struck and killed a pedestrian in Tempe, Ariz., in what appears to be the first known fatality of its kind. The incident was reported around 10 p.m., so it occurred at night.
There is not nearly enough information to draw conclusions. But the known facts that the car was in autonomous mode and struck a pedestrian are damaging to Uber. They also don’t help the case for self-driving cars in the minds of the public. Uber has halted all of its self-driving tests on public roads in response, and the NTSB is sending a team to investigate. Supporters of deploying autonomous vehicles will point out that mile-for-mile they have actually been relatively safe (at least with a safety driver behind the wheel).
However, that won’t stop a frenzy of “I told you so” comments from naysayers. Before we rush to judgment, though, there are some key facts that need to be sorted out. In the case of the man killed while driving his Tesla with Autopilot engaged, it was months before all the facts were in, and some questions remain open even now. Hopefully this incident will be clarified sooner than that, but here are the questions we need answered first:
What About the Vehicle Safety Systems?
There is a great deal of confusion about the role of various automated systems in a car. Several times, including after the infamous Tesla fatality in Florida, “autonomous” features have been blamed. However, both the Tesla and, most likely, the Uber car in this case are equipped with far more conventional collision-avoidance systems designed to brake before hitting an obstacle. So the Automatic Emergency Braking (AEB) system clearly needs to be examined in this case.
How Did the Safety Driver React?
I really don’t envy the job of being a safety driver. You need to sit behind the wheel ready to take over mile after mile, but might spend long periods of time doing nothing. Then you might have only a second or two to suddenly re-engage with the vehicle and take corrective action. This issue is part of why some, like Google and Cruise, have argued that sometimes-automated vehicles aren’t really the way to go, and that we need complete Level 5 autonomy (where the car never needs a human) to ensure safety. Of course, it needs to really work!
Along with how the particular safety driver involved behaved in this situation, it’ll be important to look at Uber’s policies for training and evaluating its drivers, as well as the rules they provide for safe operation of the vehicle.
What Role Did the Pedestrian Play?
Is It Self-Driving or Is It Uber?
Any crash involving a potentially autonomous vehicle generates a lot of interest and early headlines, along with Twitter feeds full of experts theorizing on what happened. But in this case, as in the others, we’ll need key questions answered before any real conclusions can be drawn. Some of those conclusions may change the way we regulate the testing and deployment of self-driving vehicles. If, for example, issues in Uber’s policies or procedures contributed to the crash, the result will likely be tighter regulation and enforcement. Since self-driving cars are heavily instrumented, including carrying multiple cameras, there should be enough information to get to the bottom of the incident fairly quickly.