Why Self-Driving Cars Keep Running Into Things

Most of us can drive a car, and many of us think we’re good at it. We probably learned when we were teenagers, and have taken the skill for granted since then — except, of course, when we’re silently cursing out another driver for cutting us off. As a result, we may not think about how complex the task is. Not just steering a vehicle along a path, but understanding and reacting to the hundreds of different objects in our environment.

This makes designing automated safety systems for cars that can act reliably in all conditions, without triggering at the wrong time, a daunting technical challenge. It has only been possible because the assumption has been that the driver is still in charge at all times, and the safety system is only there to assist as a last resort. But as we rush toward self-driving cars, that assumption is in question. We now have a variety of driver assist systems, of which Tesla's Autopilot is the best known, that can take full control of your vehicle in many situations. However, as smart as they are, those automated systems can do some pretty dumb things if the driver isn't paying enough attention, like smashing into the back of parked emergency vehicles. We take a look at how some of the pieces of these systems work, and at some of the reasons these accidents might be happening.

To Brake Or Not To Brake

Automatic emergency braking (AEB) systems have to walk a fine line between braking too late and braking for things that aren't really threats. Toyota, for example, has had to recall several models of its AEB-equipped cars after some of them braked to a stop for harmless objects, such as metal plates on the roadway. In addition to having limitations when it comes to recognizing potential hazards, AEB systems also can't read your mind. Perhaps you're speeding up a bit in your lane to make it easier to pass the car in front of you. If you stayed in a straight line you'd hit it, but you know you're going to steer out of the way. Your AEB system doesn't know that, so it either has to be quite conservative about deciding a collision is really unavoidable, or it has to guess. Volvo's City Safety system, for example, can be overridden if the driver is applying acceleration: it assumes the maneuver is deliberate and deactivates. That was cited as one of the possible causes of the much-publicized 2015 accident in which a self-parking demo went wrong and caused injuries.
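
To make that trade-off concrete, here's a minimal sketch of the kind of decision an AEB system has to make, assuming it already has a fused estimate of the distance and closing speed to the object ahead. The time-to-collision threshold and the throttle-override rule are invented for illustration, not any manufacturer's actual values or logic:

```python
# A minimal sketch of the AEB trade-off described above. Thresholds
# are invented for illustration; this is not any manufacturer's code.

BRAKE_TTC_S = 0.8        # only brake when a collision is nearly certain
OVERRIDE_THROTTLE = 0.3  # hypothetical "driver is accelerating" threshold

def aeb_should_brake(distance_m: float,
                     closing_speed_mps: float,
                     throttle_position: float) -> bool:
    """Decide whether automatic emergency braking should fire."""
    if closing_speed_mps <= 0:
        return False  # we aren't closing on the object at all

    time_to_collision_s = distance_m / closing_speed_mps

    # The system can't know the driver plans to steer around the
    # obstacle, so it waits until very late: hence the tiny TTC window.
    if time_to_collision_s > BRAKE_TTC_S:
        return False

    # Volvo City Safety-style override: firm acceleration is read as a
    # deliberate maneuver, and the system stands down.
    if throttle_position > OVERRIDE_THROTTLE:
        return False

    return True
```

The tiny time-to-collision window is the whole problem in miniature: widen it and the car brakes for metal plates; narrow it and the car brakes too late for real obstacles.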

Drivers Don’t Understand Their Driver Assist Systems

Top-of-the-line cars have several different automated safety systems. Each one comes with its own set of capabilities and limitations. Most only work within a specific speed range, and will only be effective in particular situations or with particular objects. Plus, descriptions of the systems tend to be swathed in legalese, so it can be hard to figure out exactly what you can expect them to do, and what their practical limitations are.
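
One way to picture those limitations is as an "operating envelope": a set of conditions that all have to hold before a given assist will even engage. The system names, speed bands, and target lists below are made up for illustration, not taken from any owner's manual:

```python
# A sketch of the "operating envelope" idea: each assist engages only
# inside its own speed band and scenario list. All values here are
# invented for illustration.

ASSIST_ENVELOPES = {
    "city_aeb":    {"min_kph": 4.0,  "max_kph": 50.0,
                    "targets": {"vehicle", "pedestrian", "cyclist"}},
    "highway_aeb": {"min_kph": 30.0, "max_kph": 150.0,
                    "targets": {"vehicle"}},
}

def assist_is_active(name: str, speed_kph: float, target: str) -> bool:
    """Check whether a given assist would even consider this situation."""
    env = ASSIST_ENVELOPES[name]
    in_speed_band = env["min_kph"] <= speed_kph <= env["max_kph"]
    return in_speed_band and target in env["targets"]

# At highway speed, the city system simply isn't listening:
assert not assist_is_active("city_aeb", speed_kph=110.0, target="pedestrian")
```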

Even Volvo’s impressive City Safety system comes with multiple screens of warnings about its limitations. Beyond that, you still need to read additional pages of information to find out that it will only detect people who are at least 32 inches tall, and that it will not detect pedestrians carrying large packages. And never mind in the dark or in tunnels.

The result, as has been verified by researchers surveying car owners, is that most people don’t have a detailed understanding of what their vehicle safety systems can and can’t do. That’s okay as long as the systems are really just an emergency backup, with the driver assuming personal and continuous responsibility for the car’s behavior. If they get into a situation that requires automated intervention, the system’s help is a bonus, and that backstop role is why AEB systems have been shown to save lives.

Cyclist detection also has plenty of limitations. Here we show the caveats on Volvo’s version as an example.

Sort-of-Self-Driving Cars: The Autopilot Problem

In fairness, we don’t read headlines every time AEB or Autopilot saves a life, although there are certainly plenty of cases where they do. Dashcam footage has captured Tesla’s Autopilot actively steering to avoid a collision after being cut off by a utility truck, for example.

What’s With Hitting Parked Emergency Vehicles?

At least three times this year, Tesla cars have plowed into parked emergency vehicles. In each case, the drivers stated they were using Autopilot. Two of the emergency vehicles were fire trucks and one was a police SUV, and all were highly visible. That doesn’t count the fatal accident in which a Tesla on Autopilot hit a highway center divider at high speed. Tesla’s response has been pretty consistent: the driver is still in charge, is supposed to be paying attention, and so on. That aside, these would intuitively seem to be the easiest kinds of accidents to avoid. So what’s lacking?

In some cases, the lane markings were hard to see or confusing; that appears to have been one element in the crash into a highway barrier, and Tesla specifies that Autopilot should only be used when lane markings are clearly visible. Beyond that, stationary objects turn out to be a real problem for automated systems. There are so many of them alongside and above the roadway that need to be ignored that it’s harder than you’d think to know which ones to pay attention to. Tesla’s reliance on radar, instead of the more expensive lidar used by most other self-driving cars, may also make it more difficult for its cars to distinguish a large vehicle from an overhead road sign, for example.
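
To see why radar makes stationary objects hard, consider a common simplification: a radar return closing on the car at roughly the car's own speed is fixed to the ground, so it gets filtered out as clutter. That bucket contains overhead signs, bridges, and guardrails, but also parked fire trucks. Here's an assumption-laden sketch of that filtering step, not Tesla's actual code:

```python
# Sketch of why radar-based systems struggle with parked vehicles.
# A return whose Doppler speed roughly cancels the car's own speed is
# "stationary" relative to the road and gets discarded as clutter,
# throwing away parked vehicles along with road signs.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float           # distance to the reflection
    radial_speed_mps: float  # speed toward us, measured via Doppler

def moving_targets(returns: list[RadarReturn],
                   own_speed_mps: float,
                   tolerance_mps: float = 1.0) -> list[RadarReturn]:
    """Keep only returns that are themselves moving through the world."""
    return [r for r in returns
            if abs(r.radial_speed_mps - own_speed_mps) > tolerance_mps]
```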

What About Uber and the Pedestrian?

The woman and bicycle were certainly obvious enough that the Uber vehicle’s lidar and software should have noticed them and reacted.

Of course, the worst possible situation is when it isn’t a thing that gets hit, but a person, possibly fatally. That’s what happened in the much-publicized fatal crash of an Uber self-driving test vehicle into a pedestrian walking a bicycle, which illustrates some of the issues faced by companies developing self-driving vehicles.

The reasons for the Uber crash include an interlinked set of policy decisions and their tragic consequences. Uber disabled the Volvo’s vaunted safety systems so they wouldn’t interfere with its autonomous mode software. That isn’t uncommon, but typically it is done when the company doing the testing believes it has something better. In this case, though, Uber’s answer was that the safety driver was to provide that function. However, Uber also fielded its cars with only a single safety driver, and then assigned that driver other tasks that distracted from paying attention to driving. The combination was a recipe for trouble. So when Uber’s software got confused, variously identifying the woman pushing the bicycle as a pedestrian and as a cyclist, not only did it fail to act in time, but the safety driver didn’t notice her until it was too late to retake control of the vehicle and stop or steer out of the way.
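
As a hedged sketch of how that confusion can cost time: if each re-classification discards the object's accumulated motion history, as was reportedly the case, the system never holds enough observations to predict the person's path and commit to braking. The names below are hypothetical, not Uber's code:

```python
# Illustration of the failure mode: a flip-flopping label that resets
# the track means path prediction keeps starting over from scratch.

class TrackedObject:
    def __init__(self, label: str) -> None:
        self.label = label
        self.history: list[tuple[float, float]] = []  # (x, y) positions

    def observe(self, label: str, position: tuple[float, float]) -> None:
        if label != self.label:
            # Pedestrian -> cyclist -> unknown: each switch resets the
            # track, so prediction restarts from a single point.
            self.history.clear()
            self.label = label
        self.history.append(position)

    def can_predict_path(self) -> bool:
        # Extrapolating a trajectory needs several consistent
        # observations of the same object.
        return len(self.history) >= 3
```

With the label flipping every few frames, can_predict_path() may never come back true, and the decision to brake keeps getting deferred until it's too late.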

None of this means that it was safe for the pedestrian to be wandering across the lanes of traffic where there was no crosswalk. But for autonomous vehicles to be accepted, they are going to need to be good at responding to situations like that.

Experts Disagree Over How Much Automation Is Good Enough

For this reason, some companies, like Alphabet’s Waymo and GM’s Cruise, don’t believe in partially autonomous vehicles, and are working on full Level 5 autonomy for their initial commercial offerings. That takes the human completely out of the loop. Of course, it is also a much harder problem to solve.

Realistically, the industry isn’t going to wait for full autonomy. Nearly every car company in the world is working hard on adding various automated assists to its cars as a way to stay competitive. On balance, most of them are likely to save lives, but we are also going to have to learn the best ways to ensure they are used wisely. Stay tuned for a boom in active driver monitoring systems, for example.

[Feature photo courtesy of Laguna Beach Police Department]
