Another Tesla Allegedly Collides With Emergency Vehicle in Autopilot Mode
Stop me if you’ve heard this one: A Tesla owner in Orlando collided with a parked police vehicle early on Saturday morning. The cruiser was on the side of the road, lights flashing while the officer helped the driver of a stranded Mercedes. According to the Tesla driver, her car was in Autopilot mode, but it failed to see the police vehicle and ended up hitting both the cruiser and the Mercedes. An investigation is underway, but this doesn’t look great as Tesla faces questions from government regulators about this very issue.
Tesla began shipping Autopilot hardware in the Model S in 2014, and the feature has since become a major selling point of its vehicles. Even the less expensive Model 3 and Model Y have Autopilot functionality, but you can also upgrade to “Full Self-Driving,” which adds features like automatic lane changes, Summon, and traffic sign recognition. Even the regular Autopilot implementation lets you set a destination and have the car handle highway driving.
Following the accident, the driver told authorities that the vehicle was “in Autopilot.” Police are investigating, and Tesla will probably have its say as well. But even if that’s true, the driver is still likely to be held responsible. Tesla’s self-driving technology is not true self-driving; it’s what’s known in the industry as SAE Level 2 automation. That means the car can control speed and lane position at the same time, and it can also take corrective actions like applying the brakes. However, the driver must remain aware of their surroundings and be ready to take over from the car at a moment’s notice, and that seems to be the issue here.
A person driving along and watching the road will almost always notice an emergency vehicle with flashing lights, so you would expect that a computer vision system watching from multiple angles at all times could do the same. We’re starting to understand that these systems are still not perfect, but they work well so much of the time that people become complacent. Humans are simply not as good at “monitoring” activities as we are at actively taking part in them. When someone does need to take over from Autopilot, they might not realize it until it’s too late.
Self-driving cars don’t become truly self-driving until SAE Level 3. At that point, the car should be able to handle an entire trip under certain conditions, and it will proactively alert the driver if they need to take over. While Tesla does not meet this threshold, the marketing and feature names (it even calls the package “Full Self-Driving Capability”) can make people feel like they’re getting a true autonomous experience. Perhaps that’s a dangerous illusion.