Cameras, AI on Self-Driving Cars May Miss Darker-Skinned Faces

Pedestrians with darker skin tones are at greater risk of not being detected by the AI recognition systems in self-driving cars. That’s the conclusion, and the safety implication, of a recent study conducted by Georgia Tech: the darker the skin tone, the more trouble pedestrian-recognition software had identifying a person as a pedestrian.

This follows other research finding that shorter pedestrians may be less likely to be recognized by autonomous vehicles than people of average height. Outside of cars, AI facial recognizers have had more trouble determining the gender of people with darker skin tones, and, separately, more trouble determining gender when parsing images known (to the researchers) to be of women. In the past, Google sometimes ID’d darker human faces as chimpanzees or gorillas, and Chinese users of the iPhone X have said the phone’s face recognition software can’t tell them apart.

As for the Georgia Tech study into predictive inequity in recognizing pedestrians, the researchers had observed higher error rates for some demographic groups. They set up the study using eight different AI systems and a collection of images of people with varying skin tones. The images were divided into lighter-skin and darker-skin categories using the Fitzpatrick scale, a 1975 classification that establishes six degrees of skin tone (1 is lightest, 6 is darkest). The scale is used in a wide range of research, such as studies of how much exposure to UV light causes sunburn.

The researchers said they began the study after casual observations that sensors, cameras, and software did a better job detecting people with lighter skin tones. They then ran lab-based tests rather than real-world trials. That is, the researchers didn’t recruit volunteers representing the six skin tones, send them onto the streets of Atlanta, and have them cross in front of autonomous vehicles to see how many made it safely to the other side. Instead, the images were fed to the software behind eight pedestrian/facial recognizers with self-driving applications. This was not a test of lidar, the optical scanners that create detailed maps of the surroundings and of animate objects such as people and animals.
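To make that setup concrete, here is a minimal sketch of this kind of per-group evaluation, assuming a set of labeled images (each containing one pedestrian and annotated with a Fitzpatrick type) and a detector object with a detect() method. The names and interface here are hypothetical, not taken from the study’s code; the 1–3 versus 4–6 split mirrors the lighter/darker grouping described above.

```python
from collections import defaultdict

def skin_tone_group(fitzpatrick_type):
    """Bin Fitzpatrick types 1-6 into the study's two groups."""
    return "lighter" if fitzpatrick_type <= 3 else "darker"

def detection_rates(detector, images):
    """Fraction of labeled pedestrians the detector finds, per group.

    Assumes every image contains exactly one annotated pedestrian and
    that detector.detect(image) returns True when it finds one.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for image in images:
        group = skin_tone_group(image.fitzpatrick_type)
        totals[group] += 1
        if detector.detect(image):
            hits[group] += 1
    return {group: hits[group] / totals[group] for group in totals}

# Run all eight detectors over the same dataset and report the gap:
# for detector in detectors:
#     rates = detection_rates(detector, labeled_images)
#     print(detector.name, rates["lighter"] - rates["darker"])
```

A positive gap in that last line would mean a detector finds lighter-skinned pedestrians more often than darker-skinned ones, which is the pattern the study reports.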

Lidar scanner.

The results: the recognition bias in favor of images of lighter-skinned people continued. Recognition accuracy for images of darker-skinned people was 5 percent lower than for images of lighter-skinned people.

Here’s the backstory on skin color and why the study was conducted. According to authors Benjamin Wilson, Judy Hoffman, and Jamie Morgenstern:

Early warnings that facial recognition might have higher accuracy on white men showed that this problem might be somewhat mitigated by training systems separately for different demographic groups. Nevertheless, recent, state-of-the-art systems designed by many major tech conglomerates have continued to face scrutiny for the behavior of their facial recognition systems. Commercial gender prediction software has been shown to have much worse accuracy on women with Fitzpatrick skin types 4-6 compared to other groups; this work inspired our use of the Fitzpatrick skin scale to categorize pedestrians. The ACLU found that Amazon’s facial recognition system incorrectly matched a number of darker-skinned members of Congress to mugshots from arrests across the country.

Critics of the study — this being academia, that’s a given — harrumphed that the Georgia Tech researchers didn’t use datasets (images and conditions) commonly used by developers of autonomous vehicles. Kate Crawford, a professor at NYU studying the social implications of AI who was not involved in the Georgia Tech study, shot back on Twitter: “In an ideal world, academics would be testing the actual models and training sets used by autonomous car manufacturers. But given those [datasets] are never made available (a problem in itself), papers like these offer strong insights into very real risks.”

The Georgia Tech study underscores the concern that artificial intelligence as we now know it still makes mistakes in recognizing people. White men of average height fare pretty well with AI recognizers. Those who are darker-skinned, female, shorter, or not of Caucasian background continue to face challenges.
