By Evan Enzer
We’ve seen governments use facial recognition to run roughshod over fundamental rights; now, landlords are bringing it into our homes.
“Please take a photo for facial recognition verification,” read a text message. My wife immediately shouted from the next room, “I just called the new apartment. They’re sending a text; they said it would look sketchy.” I wasn’t happy. The leasing office could verify my identity in other ways. Why not just have a leasing agent look at my driver’s license? But we had spent hundreds of dollars submitting our application and deposit, and I didn’t want to start our apartment search all over again.
The facial recognition system recognized me quickly, but my wife was a different story. Inaccuracy is common for facial recognition tech. Even the programs vendors claim are the “most accurate” are about ten percent worse at identifying women than men. The error rates are even worse for people of color. A landmark study found facial recognition can misidentify Black women 100 times more often than white men. Even in systems with human review, facial recognition programs are wrong more often than they are right. And in one particularly awful application, New York’s Metropolitan Transportation Authority found facial recognition to be 100% inaccurate.
In our case, it took several attempts and a call to the leasing office before the software finally registered my wife’s face: an easily avoidable nuisance, but nothing life-threatening. Others have not been so lucky. Facial recognition errors can lead to life-altering consequences. Two among the many examples: Texas denied a woman’s unemployment insurance application, making her rent impossible to afford; and New Jersey police arrested an innocent Black man, leaving him with an arrest record that could affect his housing opportunities and employment.
My new state’s laws hadn’t protected me from needless facial recognition. Where I live, in Texas, companies can’t collect your biometric identifiers without your consent. But the law doesn’t give you another option if you say no. The apartment manager asked for permission, but I had no real choice other than to give it. We had invested far too much time and money to change course. Most of the U.S. has even fewer restrictions on the use of biometric information. Companies sell biometric data to marketers and governments without oversight. Then state, federal, and private actors can use it to suppress our speech, prosecute our choices, and deter us from exercising our basic rights.
Rental applications are only one arena where facial recognition and cameras are invading our homes. Landlords and governments are installing biometric entry systems, gifting video doorbells to our neighbors, and mounting CCTV in common areas. At least one city has even installed facial recognition-enabled cameras outside a public housing complex to gather evidence against tenants. Together, these technologies form an intricate system that can watch everything we do, wrongly send us to prison, deport our friends, note who visits, and track when we leave for work.
Despite its inaccuracy and dangerous consequences, facial recognition is becoming more common. Equifax launched a facial recognition product it markets to leasing offices. Socure sells a service that uses facial recognition and algorithmic scoring to guess whether a person will pay off their purchases, and others are selling biometric verification to inform lending decisions. ODIN markets a facial recognition system it claims can identify people experiencing homelessness and provide their personal information to the police. That information includes any outstanding arrest warrants, which often do nothing more than criminalize poverty and make housing harder to secure; and allegations of past behavior, which may put armed officers on edge and make constructive outreach more challenging. ODIN claims its system can also use biometric identifiers and location tracking to check people into shelters remotely, but nothing about that task requires either. As my wife’s verification ordeal made clear, facial recognition doesn’t work the way it’s supposed to, and we can’t rely on it to make important decisions about housing, credit, or policing.
A few cities, states, and federal representatives are seeking to rectify these problems. The New York City Council recently reintroduced a proposal that would require landlords to offer tenants physical keys in place of biometric access systems; city advocates also hope to introduce more stringent measures in the future. Several cities and states have banned or regulated government use of facial recognition, and members of Congress have proposed prohibiting facial recognition in public housing.
Our country, cities, and states need to enact these laws and more. Most rights and liberties were left unwritten in the U.S. Constitution, but freedoms relating to the home are different. The Constitution expressly protects privacy in our property, indicating that surveillance technology does not belong in our homes. Much has changed since America’s founding; urbanization pushed us closer together, and technology connected us on a scale never before possible. But we never agreed to sacrifice our rights. We need new, more effective laws to ensure those rights remain intact.
Evan Enzer is a certified privacy professional and California attorney. He previously worked in housing case management and is currently a D.A.T.A. legal fellow with the Surveillance Technology Oversight Project, a New York-based civil rights and privacy group.