Followers of S.T.O.P.’s work know our concerns about AI-driven hiring tools. Rather than removing human bias from the employment process, this technology drifts toward bias by default: penalizing resumes that contain the word “women,” favoring candidates named “Jared,” and learning other trivial and discriminatory proxies for success. That is because these systems are trained on data reflecting past racist and sexist hiring decisions.
And AI-driven hiring algorithms are just the tip of the iceberg. AI is also trained on biased data collected through discriminatory policing, powering invasive surveillance technologies like facial recognition, predictive policing, so-called gang databases, and much more. This dangerous technology is already helping law enforcement decide whom to surveil, where to patrol, and whom to arrest. In response, S.T.O.P. and our partners are campaigning to prohibit law enforcement from using these systems.
John Oliver is talking about AI bias, so come join the conversation! Your support helps us fight the scourge of biased AI. Contributions from our community of donors make it possible for us to keep addressing these urgent issues through our litigation and legislative work. You can learn more about our campaigns and lawsuits by visiting our website.