By Sarah Roth and Evan Enzer
It took decades of challenging systemic barriers and exclusion for disabled Americans to secure their right to exist and participate fully in our society. Thirty-three years ago, Congress passed the Americans with Disabilities Act to protect people with disabilities from ableist discrimination. At its core, the law was intended to facilitate access to transportation and public accommodations, like restaurants and workplaces. Yet people with disabilities still face myriad challenges that infringe on their right to survive and thrive in our society, and mass surveillance technology is only making the problem worse.
Data-driven surveillance tech has been incorporated into nearly every sector of public life, popping up in our shopping centers, recreational hubs and the transit lines that connect them. City officials and corporate representatives promised these tools would make all our lives easier and our ventures into the public sphere safer. Yet study after study has shown that these technologies are biased and discriminatory because they are neither built nor used with accessibility in mind.
Biometric monitoring software compares behaviors against a baseline embedded in its design, one that does not account for the diversity and nuance of disabilities. Ableist assumptions about what disabilities can and should look like are entrenched in these systems, putting people with disabilities at risk of being singled out or subjected to dehumanizing punishment simply for existing as themselves.
Take, for instance, Amazon’s Flex program, which uses an app to track Amazon delivery drivers with the intent of rewarding or penalizing them based on their efficiency. This approach discounts the experiences of workers with disabilities, and Amazon’s algorithmic management system has reportedly fired its slowest drivers regardless of an individual’s disability or access needs.
The expansion of biometric technology also threatens the health and safety of people with disabilities. In March, New York City Mayor Eric Adams proposed that stores ban patrons who refuse to remove their masks and expose their faces to surveillance cameras equipped with facial recognition. This policy discriminates against immunocompromised individuals, putting those who rely on the lifesaving health benefits of masks at risk. Moreover, by singling out shoppers who cannot remove their masks, Mayor Adams is setting a standard of denying entry to, and even criminalizing, those who are unable to adjust their behavior and appearance to the demands of the surveillance state.
Mass surveillance technologies forced onto people with disabilities threaten the very meaning of accessibility and risk excluding them from society, misinterpreting their behavior as dangerous, and robbing them of their autonomy. As a country, we need to do more to ensure that technological change does not come at the expense of disability rights and justice.
Lawmakers have been slow to act, as they often are with technological change, but there is some progress. The New York City Council recently introduced a landmark ordinance that would ban facial recognition in places of public accommodation, ensuring that biased technology does not threaten disabled people or exclude them from public life.
As these surveillance tools grow ever more prominent and inescapable, the urgency with which lawmakers need to act cannot be overstated.
Sarah Roth is a development and communications fellow at the Surveillance Technology Oversight Project (S.T.O.P.).
Evan Enzer is a privacy professional and legal fellow at S.T.O.P.