NIST study confirms facial-recognition tech misidentifies women, children, the elderly, and people of color at higher rates
We all knew facial-recognition technology was flawed, just perhaps not this flawed.
A new study from the National Institute of Standards and Technology, published Dec. 19, lays out in painstaking detail how facial-recognition tech misidentifies the elderly, the young, women, and people of color at higher rates than it does white men. In other words, the populations already most at risk are also the ones most likely to suffer false matches and any associated legal troubles that follow.
Just how bad is it? Let's let the NIST study authors explain.
"We found false positives to be higher in women than men, and this is consistent across algorithms and datasets," they wrote. "We found elevated false positives in the elderly and in children; the effects were larger in the oldest and youngest, and smallest in middle-aged adults."
And that's not all. "With mugshot images," the authors continued, "the highest false positives are in American Indians, with elevated rates in African American and Asian populations."
Why does this matter? Law enforcement uses the technology, so false positives can lead directly to mistaken arrests and harassment.
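For readers unfamiliar with the metric, a false positive in a face search means the system matches a probe photo to someone else's enrolled image with high confidence. Here is a minimal sketch of how a per-group false positive rate might be tallied; the groups, scores, and threshold are entirely hypothetical and are not drawn from the NIST benchmark:

```python
from collections import defaultdict

# Hypothetical search results: (probe_group, true_identity, matched_identity, score).
# None of these values come from the NIST study; they only illustrate the metric.
results = [
    ("group_a", "alice", "alice", 0.92),
    ("group_a", "bob",   "carol", 0.81),  # wrong person above threshold -> false positive
    ("group_b", "dana",  "dana",  0.95),
    ("group_b", "evan",  "evan",  0.64),  # below threshold -> no match returned
]

THRESHOLD = 0.75  # made-up acceptance threshold

def false_positive_rates(results, threshold):
    """Share of searches, per group, that return the wrong identity above threshold."""
    totals, false_pos = defaultdict(int), defaultdict(int)
    for group, truth, matched, score in results:
        totals[group] += 1
        if score >= threshold and matched != truth:
            false_pos[group] += 1
    return {group: false_pos[group] / totals[group] for group in totals}

print(false_positive_rates(results, THRESHOLD))  # {'group_a': 0.5, 'group_b': 0.0}
```

The NIST finding, in these terms, is that this rate varies sharply by demographic group across most of the algorithms tested.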
This study, which claims "empirical evidence" for its findings, is sure to add support to lawmakers' calls to ban the controversial tech.
"We have started to sound the alarm on the way facial recognition technology is expanding in concerning [ways]," wrote congresswoman Alexandria Ocasio-Cortez in July. "From the FBI to ICE to Amazon, the bar for consent and civil liberties protection is repeatedly violated, and on top of it all has a disproportionate racial impact, too."
She now has additional evidence to back up that latter claim.
SEE ALSO: Here's why San Francisco's vote to ban facial-recognition tech matters

Importantly, the congresswoman isn't alone in her concern. In a statement published by the Washington Post, Senator Ron Wyden reacted to the NIST findings by stating that "algorithms often carry all the biases and failures of human employees, but with even less judgment."
A growing number of cities, including San Francisco and Berkeley, recently moved to ban some government use of the tech. Perhaps this study will encourage others to follow suit.
Topics: Facial Recognition