Autonomous Vehicles (AVs) Study Exposes “significant fairness issues related to skin tone and age”

By B.N. Frank

Over the years, numerous incidents, reports, and studies have justified Americans’ concerns about and/or opposition to driverless, aka autonomous, vehicles (AVs) (see 1, 2, 3, 4, 5, 6, 7, 8). Ditto for artificial intelligence (AI) technologies, particularly facial recognition (see 1, 2, 3). Recently published research provides another reason to be worried about both.

From Biometric Update


Autonomous cars may have a person detection bias problem with complexions

Aug 25, 2023, 1:56 pm EDT | Masha Borak

Makers of self-driving cars have for years been fending off criticism about safety. Now, they may have to defend themselves against charges of AI bias.

A team of researchers from King’s College London and Peking University examined eight deep-learning pedestrian detectors used in driverless-car research, evaluating them on four widely studied testing datasets. Their results show significant fairness issues related to skin tone and age.

Detection accuracy for adults was 19.67 percent higher than for children. The study also found that, across all of the image datasets, the systems were 7.52 percent better at detecting light-skinned individuals than dark-skinned ones.

Results also showed that detection performance for the dark-skinned group dropped further under low-brightness and low-contrast conditions compared to the light-skinned group: the miss-rate gap between dark- and light-skinned subjects jumped to 9.68 percent at night.
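To make the reported gap concrete, here is a minimal sketch of how a per-group miss rate and the gap between groups might be computed. The group names, the detection outcomes, and the structure of the data are hypothetical illustrations, not the researchers’ actual evaluation code or results.

```python
# Hypothetical illustration: per-group pedestrian miss rates and the gap
# between groups. Not the study's actual evaluation code; values are made up.

def miss_rate(results):
    """Fraction of annotated pedestrians the detector failed to find."""
    missed = sum(1 for detected in results if not detected)
    return missed / len(results) if results else 0.0

# Each list holds one boolean per annotated pedestrian: True if the detector
# produced a matching box, False if it missed that person.
results_by_group = {
    "light_skin": [True, True, True, False, True, True, True, True, True, True],
    "dark_skin":  [True, False, True, False, True, True, True, False, True, True],
}

rates = {group: miss_rate(r) for group, r in results_by_group.items()}
gap = abs(rates["dark_skin"] - rates["light_skin"])

for group, rate in rates.items():
    print(f"{group}: miss rate {rate:.2%}")
print(f"miss-rate gap: {gap:.2%}")  # the study reports a 9.68% gap at night
```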

“Fairness issues in autonomous driving systems, such as a higher accuracy in detecting pedestrians of white ethnicity compared to black ethnicity, can perpetuate discriminatory outcomes and unequal treatment based on race,” the researchers write.

The situation is hardly surprising, given the problems with face detection for different demographics experienced by some users of remote proctoring software.

The study finds some positives: the difference in detection accuracy between genders was only 1.1 percent.

Fairness is an emerging domain within software testing. For the study, the researchers manually added demographic labels to the four large-scale datasets, producing 8,311 images with more than 16,000 gender labels, 20,000 age labels, and 3,500 skin-tone labels.
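As a rough illustration of what such manual labeling produces, a single annotated pedestrian might carry a record like the one below. The field names and values are purely illustrative; the paper’s actual label format is not described in this article.

```python
# Hypothetical sketch of one demographic annotation attached to a
# pedestrian bounding box. Field names and values are illustrative only.
annotation = {
    "image_id": "0001.jpg",      # image from one of the four datasets
    "bbox": [120, 64, 45, 110],  # pedestrian bounding box (x, y, w, h)
    "gender": "female",          # one of the ~16,000 gender labels
    "age": "child",              # one of the ~20,000 age labels
    "skin_tone": "dark",         # one of the ~3,500 skin-tone labels
}
print(annotation["skin_tone"])
```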

Self-driving companies such as Waymo, a subsidiary of Alphabet, are already responding to the new study. In a statement to Gizmodo, Waymo said that the study doesn’t take into account all the other tools that its autonomous vehicles use to detect pedestrians.

“We don’t just use camera images to detect pedestrians,” says Waymo spokesperson Sandy Karp. “Instead, we tap into our full sensor suite – including our lidars and radars, not just cameras – to help us actively sense details in our surroundings in a way that would be difficult to do with cameras alone.”

Masha Borak is a technology journalist. Her work has appeared in Wired, Business Insider, Rest of World, and other media outlets. Previously she reported for the South China Morning Post in Hong Kong. Reach out to her on LinkedIn.


Activist Post reports regularly about AI, AVs, and other unsafe technologies. For more information, visit our archives.
