Uber and Tesla both suffered major self-driving setbacks in recent weeks, each involving a fatal traffic accident: an Uber vehicle struck and killed a pedestrian, while Tesla Model X driver Walter Huang died when his vehicle struck a concrete median in Autopilot mode. Huang’s family has hired the law firm Minami Tamaki to investigate the situation and claims a preliminary report shows the Autopilot system is clearly deficient. Tesla, meanwhile, continues to place the blame entirely on the driver.
“(Our) preliminary review indicates that the navigation system of the Tesla may have misread the lane lines on the roadway, failed to detect the concrete median, failed to brake the car, and drove the car into the median,” Minami Tamaki said.
The family has claimed that Huang complained about issues with Autopilot in that specific area. Tesla argues that this very point undermines any argument that Autopilot was to blame for Huang’s death. In a statement to ABC News, the company said that, according to telemetry from the Model X, the driver’s hands were not on the wheel for the six seconds preceding the crash, despite multiple warnings to re-engage with it.
The problem here goes deeper than the question of whether Tesla’s Autopilot is linked to Huang’s death. Proponents of self-driving cars have often pointed to the fact that tens of thousands of people die in automotive accidents every year. Human-controlled driving isn’t pretty, and your average human isn’t particularly good at it. Self-driving vehicles could very well improve on a bad situation.
But not much ink gets spilled on the inevitable transition periods, during which self-driving cars aren’t going to be as good at driving as their human counterparts. It’s easy to explain Level 5 self-driving to people, because that’s when the car is capable of doing everything. The lower levels, which give the vehicle partial control in certain circumstances, can only function properly if the driver is completely aware of the system’s limitations and capabilities.
A recent video shot by someone from the Chicago area attempted to replicate the California accident and very nearly succeeded: the vehicle heads directly toward a slab of concrete before the driver takes the wheel again. Absent any indication that Huang intended to take his own life or suffered a heart attack, stroke, or similar event, we can at least conclude that his death was unintentional and that he took his hands off the wheel because he believed Autopilot would guide the vehicle accurately. And while the Uber crash from last month isn’t the primary focus of this story, we can also assume that the driver in that incident had no intention of killing a pedestrian.
The fundamental problem with self-driving vehicles that aren’t capable of full, robust, Level 5 performance (and none of them currently are) is that at some point, the vehicle is going to decide it can’t handle road conditions. The human driver may or may not be aware that decision has been made. Even if the driver is aware of it, he or she might not be able to react quickly enough to prevent an accident. And the smarter these systems become, the greater the chance that the car might make one decision to evade a catastrophe while the driver attempts to take a different, confounding action.
Self-driving cars really could revolutionize transport long-term. They could change the dynamics of vehicle ownership, help older people retain self-sufficiency, and slash the rate of death associated with drunk, distracted, and drowsy driving. The fact that some of these gains could take decades to fully arrive, given how long it takes vehicle fleets to turn over, is no reason not to pursue them. But uncertainty around self-driving vehicle intelligence and operational characteristics is a problem today, and it will remain one for the foreseeable future. The liability questions aren’t going away any time soon.
The NTSB has since revoked Tesla’s status as a party to the investigation of the crash. As a party to an NTSB investigation, Tesla was required to respect the confidentiality of that investigation; by publicly taking the position that Huang was solely responsible for the incident, Tesla broke that requirement. The NTSB has a less rosy view of Autopilot’s current functionality than Tesla does.