Last month, an Uber self-driving car struck and killed a woman crossing the road. Since the event, there’s been an ongoing discussion about the safety of self-driving cars, larger questions about how accurately they can perceive obstructions in the road, and a related debate about the reliability of features like Tesla’s Autopilot. The National Transportation Safety Board has issued its own preliminary report on the crash, and the early data isn’t great for Uber.
According to the NTSB, the vehicle detected the pedestrian in its path a full six seconds before impact, giving the system time to respond. Furthermore, Uber had explicitly disabled the Volvo XC90’s built-in safety features, including its collision avoidance capability and automatic emergency braking. At 1.3 seconds before impact, the car determined that automatic emergency braking should have been triggered. Here’s the report:
Data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
This report is preliminary and therefore contains no formal finding of fact concerning the cause of the crash, but it generally confirms what’s been suspected since the event. The vehicle “saw” the pedestrian, but it did not properly classify her or alert the driver to her presence. Uber’s self-driving system determined that emergency braking was needed 1.3 seconds before the crash, but with that function disabled, the car kept moving; the driver began to brake less than a second before striking the pedestrian. Whether an extra half-second of braking, combined with potential collision avoidance maneuvers, would have saved the woman’s life is obviously unknowable. But Uber’s system, in this case, did not deliver the redundancy or sophisticated avoidance capability that self-driving cars are broadly expected to provide before they enter service.
In the crash video released after the event, the driver can be seen glancing down multiple times, including immediately before impact. According to her testimony, she was monitoring the self-driving interface at those moments, and neither her business nor her personal cell phone was in use. If true, this speaks to an underlying concern about self-driving cars: if vehicle operators must also monitor readouts for additional information, does that divided attention itself increase the risk of accidents?
These sorts of questions aren’t going to stop, despite pushback from those who believe self-driving systems should be presented as an unqualified, unquestioned good. There are very real questions about how self-driving cars report what they “see,” whether drivers properly understand the limits of these systems, and how those systems interact with the conventional driving experience before we reach Level 5 autonomy. The preliminary NTSB report demonstrates we’ve still got a long way to go, even though it refrains from a formal finding of fault (that determination is reserved for the final report). In the wake of the crash, Uber has canceled its Arizona self-driving program and announced it will focus on other markets for now.