Federal study confirms facial recognition is a biased mess

False positives abound.
By Jack Morse
It's a match. Credit: MALTE MUELLER / GETTY

We all knew facial-recognition technology was flawed, just perhaps not this flawed.

A new study from the National Institute of Standards and Technology (NIST), published on Dec. 19, lays out in painstaking detail how facial-recognition tech misidentifies the elderly, the young, women, and people of color at higher rates than white men. In other words, more at-risk populations are also the ones more likely to suffer false matches and any legal troubles that follow.

Just how bad is it? Let's let the NIST study authors explain.


"We found false positives to be higher in women than men, and this is consistent across algorithms and datasets," they wrote. "We found elevated false positives in the elderly and in children; the effects were larger in the oldest and youngest, and smallest in middle-aged adults."

And that's not all. "With mugshot images," the authors continued, "the highest false positives are in American Indians, with elevated rates in African American and Asian populations."

Why does this matter? Because law enforcement uses the technology, and false positives can lead directly to mistaken arrests and harassment.

This study, which claims "empirical evidence" for its findings, is sure to add support to lawmakers' calls to ban the controversial tech.

"We have started to sound the alarm on the way facial recognition technology is expanding in concerning [ways]," wrote congresswoman Alexandria Ocasio-Cortez in July. "From the FBI to ICE to Amazon, the bar for consent and civil liberties protection is repeatedly violated, and on top of it all has a disproportionate racial impact, too."

She now has additional evidence to back up that latter claim.

Importantly, the congresswoman isn't alone in her concern. In a statement published by the Washington Post, Senator Ron Wyden reacted to the NIST findings by stating that "algorithms often carry all the biases and failures of human employees, but with even less judgment."

A growing number of cities, including San Francisco and Berkeley, recently moved to ban some government use of the tech. Perhaps this study will encourage others to follow suit.

Jack Morse

Professionally paranoid. Covering privacy, security, and all things cryptocurrency and blockchain from San Francisco.
