Another Tesla Allegedly Collides With Emergency Vehicle in Autopilot Mode

Stop me if you’ve heard this one: A Tesla owner in Orlando collided with a parked police vehicle early on Saturday morning. The cruiser was on the side of the road, lights flashing while the officer helped the driver of a stranded Mercedes. According to the Tesla driver, her car was in Autopilot mode, but it failed to see the police vehicle and ended up hitting both the cruiser and the Mercedes. An investigation is underway, but this doesn’t look great as Tesla faces questions from government regulators about this very issue.

Tesla rolled out Autopilot to Model S owners in 2014, and the feature has since become a major selling point of its vehicles. Even the less expensive Model 3 and Model Y have Autopilot functionality, and you can upgrade to "Full Self-Driving," which adds features like lane changing, Summon, and traffic sign identification. Even the regular Autopilot implementation lets you set a destination and have the car handle highway driving.

Following the accident, the driver told authorities that the vehicle was “in Autopilot.” Police are investigating, and Tesla will probably have its say as well. But even if this is true, the driver is going to be held responsible. Tesla’s self-driving technology is not true self-driving. It’s what’s known in the industry as SAE level 2 automation. That means that the car can control speed and lane position at the same time, and can also take corrective actions like applying the brake. However, the driver must remain aware of their surroundings to take over from the car at a moment’s notice, and that seems to be the issue.

A person driving along and watching the road will always notice emergency vehicles with flashing lights, so you would expect that a computer vision system watching constantly from multiple angles could do the same. We're starting to understand that these systems are still not perfect, but they're so good so much of the time that people become complacent. Humans are simply not as good at "monitoring" activities as we are at actively taking part in them. When someone does need to take over from Autopilot, they might not realize it until it's too late.

Self-driving cars don't become truly self-driving until SAE level 3. At that point, the car should be able to handle an entire trip under certain conditions, and it will proactively alert the driver if they need to take over. While Tesla does not meet this threshold, the marketing and features (they even call it "Full Self-Driving Capability") can make people feel like they're getting a true autonomous experience. Perhaps that's a dangerous illusion.