Facial recognition on Tinder is stalkerware

By Lyla Renwick-Archibold and Owen May

Facial recognition has moved from the military to police departments, and now to our bedrooms, with one startup pitching the invasive and biased technology as a solution to partner infidelity. OopsBusted advertises its service as a way to bring transparency into modern dating by leveraging artificial intelligence—including facial recognition—to discover if partners are active on popular dating apps.

OopsBusted is simple to use. For a small fee, users can upload an image of anyone they choose, along with that person's name, sex, age, and location. Using this information, OopsBusted scans Tinder, Bumble, and Hinge feeds in the specified area with facial recognition technology and provides the user with any matching profiles.

Tinder, Bumble, and Hinge all prohibit scrapers and data miners such as OopsBusted in their Terms of Use agreements. But the dangers posed by services such as OopsBusted are far more insidious than end-user license agreement violations: they enable intimate partner violence and normalize invasive surveillance practices.

OopsBusted takes no substantial measures to protect against abuse of its service. While it is advertised to suspicious partners, stalkers, abusers, or even strangers could just as easily use it to monitor a victim's dating activity without their knowledge or consent. This lack of precaution against abuse echoes the privacy threats posed by the launch of Apple's AirTag, which enabled discreet location tracking through the near ubiquity of Apple products.

Services like OopsBusted should also raise concerns about false identification, particularly for groups underrepresented in the data their algorithms are trained on. Facial recognition technologies have repeatedly demonstrated poor accuracy for darker-skinned people, and for darker-skinned women in particular. The case of Porcha Woodruff, a Detroit woman arrested on robbery and carjacking charges while eight months pregnant on the basis of a false facial recognition match, is just one of many examples of this technology misidentifying people of color at a disproportionate rate.

That poor accuracy creates a real risk of misidentification: OopsBusted could report that someone is active on a dating app when they are not, fueling existing violence with false information. The company has failed to responsibly address these limitations and risks.

As seen in the case of Tinder's half-baked background check, unfettered access to surveillance technology can enable and perpetuate violence. While OopsBusted does not expose criminal records or financial information, a report that convinces a jealous partner their significant other is on a dating app may itself lead to intimate partner violence. Jealousy is a known risk factor for intimate partner violence, and Black women already face a disproportionate risk of domestic violence.

Aside from the real threats these technologies pose to already vulnerable populations, their spread into consumer markets represents a worrying cultural shift in perceptions of privacy. Using facial recognition and AI-enabled surveillance in non-essential applications erodes public resistance to the same technology being abused by companies and governments.

The inherent risks of amassing sensitive user information have been demonstrated repeatedly by data breaches and the recent FTC crackdown on data brokers. Without robust privacy safeguards, services like OopsBusted pose a serious threat to consumer privacy and safety.

At first glance, OopsBusted and services like it may appear harmless: a way for concerned partners to ensure fidelity using nifty new technology. But by deploying biometric surveillance where it is not needed, these services deepen the disparities already present in the technology and make light of the dangers it can pose. If you worry that your partner may be disloyal, there is a solution older and cheaper than sketchy facial recognition: just talk to them.

Renwick-Archibold is a Research Intern at the Surveillance Technology Oversight Project (S.T.O.P.) and a senior at Washington University in St. Louis studying computer science and brain sciences. May is a Research Intern at S.T.O.P. and a junior at the University of Pittsburgh studying political science, law, and computer science.
