Cameras, AI on Self-Driving Cars May Miss Darker-Skinned Faces

Pedestrians with darker skin tones are at risk of not being detected by the AI recognition systems in self-driving cars. That’s the conclusion, and the safety implication, of a recent study conducted by Georgia Tech: the darker the skin tone, the more trouble pedestrian-recognition software had identifying a pedestrian as one.

This follows other research that found short pedestrians may be less likely to be recognized by autonomous vehicles than people of average height. Outside the automotive world, AI facial recognizers have had more trouble determining the gender of people with darker skin tones, and face recognition software has more often misjudged gender when parsing images known (to the researchers) to be of women. In the past, Google’s software sometimes labeled darker-skinned faces as chimpanzees or gorillas. Chinese users of the iPhone X have said the phone’s face recognition software can’t tell them apart.

As for the Georgia Tech study into predictive inequity in recognizing pedestrians, the researchers undertook it after observing higher error rates for some demographic groups. They set up the study using eight different AI systems and a set of images of people with varying skin tones. The images were divided into lighter-skin and darker-skin categories using the 1975 Fitzpatrick scale, which defines six skin types (1 is lightest, 6 is darkest). The scale is used in a wide range of studies, such as how much exposure to UV light causes sunburn.
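For illustration, here is a minimal Python sketch of that grouping step, assuming images annotated with a Fitzpatrick type and a 1-3 vs. 4-6 split; the data structure and field names are assumptions made for this sketch, not the authors’ published code.

```python
# Minimal sketch (not the authors' published code): bin annotated
# pedestrian images into the two skin-tone groups the study compares.
# The dataclass fields and the 1-3 vs. 4-6 split are illustrative
# assumptions.
from dataclasses import dataclass

@dataclass
class PedestrianImage:
    path: str
    fitzpatrick_type: int  # 1 (lightest) through 6 (darkest)

def split_by_skin_tone(images: list[PedestrianImage]):
    """Separate images into lighter-skin (types 1-3) and darker-skin (types 4-6)."""
    lighter = [img for img in images if img.fitzpatrick_type <= 3]
    darker = [img for img in images if img.fitzpatrick_type >= 4]
    return lighter, darker
```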

The researchers said they began the study after casual observations that sensors, cameras, and software did a better job of detecting people with lighter skin tones. The tests were run in the lab rather than out in the real world. That is, the researchers didn’t recruit volunteers representing the six skin types, send them onto the streets of Atlanta to cross in front of autonomous vehicles, and count how many made it safely to the other side. Instead, the images were fed to the software behind eight pedestrian/facial recognition systems with self-driving applications. This was not a test of lidar, the laser scanners that create detailed maps of the nearby surroundings, including animate objects such as people and animals.
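A hedged sketch of what such a lab evaluation could look like: run each detection model over both image groups and compare per-group detection rates. Here `run_detector` is a hypothetical stand-in for whatever inference call each of the eight systems actually exposes; none of this is the study’s real harness.

```python
# Illustrative lab-style evaluation (an assumption-laden sketch, not the
# study's actual harness): run each pedestrian-detection model over the
# lighter-skin and darker-skin image groups and report the gap.

def run_detector(model, image_path: str) -> bool:
    """Hypothetical stand-in for a real system's inference call.

    Returns True if the model detects the labeled pedestrian in the image.
    Replace with the actual API of whichever detection system is under test.
    """
    raise NotImplementedError("plug in a real detector here")

def detection_rate(model, image_paths: list[str]) -> float:
    """Fraction of images in which the model finds the labeled pedestrian."""
    hits = sum(1 for path in image_paths if run_detector(model, path))
    return hits / len(image_paths)

def compare_groups(models: dict, lighter: list[str], darker: list[str]) -> None:
    """Print per-model detection rates for each group and the gap between them."""
    for name, model in models.items():
        light_rate = detection_rate(model, lighter)
        dark_rate = detection_rate(model, darker)
        print(f"{name}: lighter {light_rate:.1%}, darker {dark_rate:.1%}, "
              f"gap {light_rate - dark_rate:+.1%}")
```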

Lidar scanner.

The results: the recognition bias in favor of lighter-toned images persisted. Recognition accuracy for images of darker-skinned people was 5 percent lower than for images of lighter-skinned people.

Here’s the backstory on skin color and why the study was conducted. According to authors Benjamin Wilson, Judy Hoffman, and Jamie Morgenstern:

Early warnings that facial recognition might have higher accuracy on white men showed that this problem might be somewhat mitigated by training systems separately for different demographic groups. Nevertheless, recent, state-of-the-art systems designed by many major tech conglomerates have continued to face scrutiny for the behavior of their facial recognition systems. Commercial gender prediction software has been shown to have much worse accuracy on women with Fitzpatrick skin types 4-6 compared to other groups; this work inspired our use of the Fitzpatrick skin scale to categorize pedestrians. The ACLU found that Amazon’s facial recognition system incorrectly matched a number of darker-skinned members of Congress to mugshots from arrests across the country.

Critics of the study — this being academia, that’s a given — harrumphed that the Georgia Tech researchers didn’t use datasets (images and conditions) commonly used by developers of autonomous vehicles. Kate Crawford, a professor at NYU studying the social implications of AI who was not involved in the Georgia Tech study, shot back on Twitter: “In an ideal world, academics would be testing the actual models and training sets used by autonomous car manufacturers. But given those [datasets] are never made available (a problem in itself), papers like these offer strong insights into very real risks.”

The Georgia Tech study underscores the concern that artificial intelligence as we now know it still makes mistakes in recognizing people. White men of average height fare pretty well with AI recognizers. People who are darker-skinned, female, shorter, or not of Caucasian background continue to be recognized less reliably.

Continue reading

Tesla Built a Supercomputer to Develop Camera-Only Self-Driving Tech

Tesla is talking about what it sees as the next leap in autonomous driving that could do away with lidar and radar, leaving self-driving cars to get around with regular optical cameras only.

Tesla Rolls Out $200 Monthly Subscription for ‘Full Self-Driving’

Some vague language on Tesla's part means that vehicles marketed as having full Autopilot capabilities might need an additional $1,500 hardware upgrade to use FSD.

Elon Musk Says Tesla Full Self-Driving Is Almost Ready for Release

Musk says the company is finally set to release the Full Self-Driving (FSD) update for compatible Tesla vehicles in the US, but this isn't the first time he's said that. If it's true this time, the software will begin rolling out at midnight on Sept. 10 to beta testers. If all goes as planned, the final launch could be just a few weeks later.

Walmart is Using Self-Driving Trucks in a 7 Mile Delivery Loop

It's the first time ever that an autonomous vehicle company has removed drivers from a delivery route's middle mile.