Documentary Exposes How Facial Recognition Tech Doesn’t See Dark Faces Accurately

By B.N. Frank

What's NOT to love about Artificial Intelligence (AI)?

  • Millions of jobs being lost (see 1, 2)
  • Censorship, unwarranted surveillance, and other unethical and dangerous applications (see 1, 2, 3, 4, 5)
  • Inaccuracies that can lead to life-altering consequences

One documentary reveals more troubling details:

CODED BIAS explores the fallout of MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all.

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, artificial intelligence is not neutral, and women are leading the charge to ensure our civil rights are protected.

The film’s website also provides information and resources for taking action.

Activist Post reports regularly about unsafe technology. For more information, visit our archives.
