20 Civil Rights Groups Say Biden Admin’s AI Bias Plan Is Insufficient


For Immediate Release

(New York, NY, 8/26/2021) – Today, 20 civil rights groups, led by the Surveillance Technology Oversight Project (S.T.O.P.), a New York-based privacy group, submitted a comment to the National Institute of Standards and Technology (NIST) arguing that the Institute’s proposal for managing artificial intelligence bias is woefully insufficient. The groups criticized NIST’s plan as “dangerously idealistic” for failing to recognize the full impact of systemic bias on AI. The groups also objected to NIST’s focus on a small number of technical sources of AI bias, noting that even “unbiased” algorithms have discriminated against low-income and BIPOC communities.

SEE: Organizational Comment To NIST
https://www.stopspying.org/s/Comment-of-20-Organizations-In-Response-To-NIST-SP-1270.pdf

Draft NIST Proposal for Identifying and Managing Bias in Artificial Intelligence
https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.1270-draft.pdf

“NIST can’t fix AI bias if they ignore systemic racism,” said Surveillance Technology Oversight Project Executive Director Albert Fox Cahn. “NIST makes the exact same mistakes as the tech companies they’re trying to rein in. They believe that if we ‘fix’ the algorithms, we can fix these systems’ real-world impact. But we know that’s not true. The drivers of bias in facial recognition, social media monitoring, and other forms of AI go far beyond technical issues like training data selection. Bias goes to the heart of how government agencies use the AI tools they purchase.”

Last month, the civil rights group released the report Scan City: A Decade of NYPD Facial Recognition Abuse. There, the group noted numerous non-technical drivers of AI bias, such as police officers’ routine alteration of suspects’ photos prior to facial recognition analysis.

SEE: MIT Technology Review – Predictive policing is still racist – whatever data it uses
https://www.technologyreview.com/2021/02/05/1017560/predictive-policing-racist-algorithmic-bias-data-crime-predpol/

Research Report – Scan City: A Decade of NYPD Facial Recognition Abuse
https://www.stopspying.org/scan-city

The Surveillance Technology Oversight Project is a non-profit advocacy organization and legal services provider. S.T.O.P. litigates and advocates for privacy, fighting excessive local and state-level surveillance. Our work highlights the discriminatory impact of surveillance on Muslim Americans, immigrants, and communities of color.
 

-- END --

CONTACT: S.T.O.P. Executive Director Albert Fox Cahn
Copyright © 2019 Surveillance Technology Oversight Project, All rights reserved.
