Uber Self-Driving Car Had 6 Seconds to Avoid Fatal Pedestrian Crash

Last month, an Uber self-driving car struck and killed a woman crossing the road. Since the event, there’s been an ongoing discussion about the safety of self-driving cars, larger questions about how accurately they can perceive obstructions in the road, and a related debate about the reliability of features like Tesla’s Autopilot. The National Transportation Safety Board has issued its own preliminary report on the crash, and the early data isn’t great for Uber.

According to the NTSB, the self-driving system had roughly six seconds to respond after it first detected the pedestrian in the vehicle’s path. Furthermore, Uber had explicitly disabled the Volvo XC90’s built-in safety features, including its collision avoidance capability and automatic emergency braking. At 1.3 seconds before impact, the system determined that emergency braking was needed. Here’s the report:

Data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

This Uber self-driving system data playback from the fatal, March 18, 2018, crash of an Uber Technologies, Inc., test vehicle in Tempe, Arizona, shows when, at 1.3 seconds before impact, the system determined emergency braking was needed to mitigate a collision. The yellow bands depict meters ahead of the vehicle, the orange lines show the center of mapped travel lanes, the purple area shows the path of the vehicle and the green line depicts the center of that path.

This report is preliminary and therefore contains no formal finding of fact on the cause of the crash, but it generally confirms what has been suspected since the event. The vehicle “saw” the pedestrian but did not properly classify her or alert the driver to her presence. The self-driving system determined that emergency braking was needed 1.3 seconds before the crash, yet with both Uber’s automated braking and Volvo’s factory system disabled, braking was left to the driver, who began to brake less than a second before striking the pedestrian. Whether an extra half-second of braking, combined with potential collision avoidance maneuvers, would have saved the woman’s life is obviously unknown. But Uber’s system, in this case, did not deliver the redundancy or sophisticated avoidance capability that self-driving cars are broadly expected to provide before they enter service.
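For a rough sense of scale, here is a back-of-the-envelope sketch using the figures above. It is not from the NTSB report: the 43 mph speed, six-second detection window, and 1.3-second braking determination are the reported numbers, while the full-braking deceleration of about 7 m/s² is an assumed typical dry-pavement value.

```python
# Back-of-the-envelope numbers from the NTSB figures: detection ~6 s before
# impact at 43 mph, and an emergency-braking determination 1.3 s before impact.
# The 7 m/s^2 deceleration is an assumed typical full-braking value,
# not a figure from the report.

MPH_TO_MS = 0.44704          # miles per hour -> meters per second

speed_ms = 43 * MPH_TO_MS    # ~19.2 m/s initial speed
decel = 7.0                  # assumed full-braking deceleration, m/s^2

def distance_at_speed(seconds: float) -> float:
    """Distance covered at constant speed over the given time, in meters."""
    return speed_ms * seconds

def speed_after_braking(seconds: float) -> float:
    """Remaining speed (m/s) after braking at `decel` for `seconds`, floored at zero."""
    return max(speed_ms - decel * seconds, 0.0)

print(f"Distance covered in 6.0 s at 43 mph: {distance_at_speed(6.0):5.1f} m")
print(f"Distance covered in 1.3 s at 43 mph: {distance_at_speed(1.3):5.1f} m")
print(f"Time needed to stop from 43 mph:     {speed_ms / decel:5.1f} s")
print(f"Speed left after braking for 1.3 s:  {speed_after_braking(1.3) / MPH_TO_MS:5.1f} mph")
```

Under those assumptions, the car covers roughly 115 meters in the six seconds after detection and needs under three seconds of full braking to stop entirely, while braking that begins only 1.3 seconds out still arrives at roughly half the original speed. How that would have translated to the actual outcome depends on details the preliminary report doesn’t address.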

In the crash video released after the event, the driver can be seen glancing down multiple times, including immediately before impact. According to her testimony, she was monitoring the self-driving interface at those moments, and her business and personal cell phones were not in use. If true, this speaks to an underlying concern about self-driving cars: if vehicle operators must also monitor system readouts, does that added distraction increase the risk of accidents?

These sorts of questions aren’t going to stop, despite pushback from those who believe self-driving systems should be presented as an unqualified, unquestioned good. There are very real questions about how self-driving cars report what they “see,” whether drivers properly understand the limits of these systems, and how those systems interact with the conventional driving experience before we reach Level 5 autonomy. The preliminary NTSB report demonstrates we still have a long way to go, even if it refrains from a formal finding of fault (that determination is reserved for the final report). As a result of the crash, Uber has canceled its Arizona self-driving program and announced it will focus on other markets for now.
