By Sarah Roth
As our institutions fail to adequately address the growing mental health and substance addiction crises, techno-solutionist snake oil is on the rise. A slew of companies hawking "mental fitness" self-monitoring apps, data analytics, location tracking, and affective computing "solutions" are selling people dodgy surveillance when what they really need is healthcare. The collaboration between Pretaa, a private behavior analytics company, and Fitbit, the wearables subsidiary of Google, on a data-driven "lifeline" for in-remission drug addiction patients is just the latest enterprise to cash in on this dangerous trend.
It is easy to forget that such devices, marketed to us as customized, opt-in services, operate via mass surveillance that is both antithetical to their self-proclaimed ethos of consent and harmful to their users. For example, the Pretaa Fitbit live-monitors and records a wearer's heart rate, notifying a clinician whenever it detects a possible relapse. This is called biometric monitoring, and it is ripe for misinterpretation and abuse because it compares a wearer's expressions (such as tone of voice, facial movements, or heart rate) against neurotypical and Eurocentric baselines. In other applications, this approach appears to skew outcomes against users with disabilities, neurodivergences, and cultural differences, whose expressions may be misdiagnosed or flagged as unhealthy by such automated systems.
In the case of the Pretaa Fitbit, the wearable could misread symptoms of other common medical conditions that elevate heart rate, such as panic disorder, as signs of drug use. This especially threatens to criminalize and harm BIPOC wearers with one or more other health conditions or external stressors, such as residing in over-policed neighborhoods. These communities already experience higher rates of mental and physical health struggles due to structural violence and the stress of police violence. And because crisis responses in these areas are frequently accompanied by police presence, an alarm raised by the Pretaa Fitbit may have criminal or even lethal repercussions for both the individual wearer and their community. Moreover, due to a lack of strong regulations on health data collection, the biometric information recorded by devices like the Pretaa Fitbit can potentially be accessed by law enforcement agencies and used to deny parole, increase sentences, or raise bail.
The Pretaa Fitbit and many other surveillance-as-healthcare tools are also equipped with location monitoring: a feature that is not only abused by discriminatory policing practices, but also a product of them. The feature tracks the wearable's location and flags a clinician whenever it enters a "danger zone": an area that law enforcement has designated as having high rates of criminal activity through inaccurate and racially biased practices such as geofencing. Geofence warrants function as a digital broken-windows policing tool, allowing law enforcement to conduct invasive digital searches of large groups of people, often in over-policed, majority-BIPOC areas. This device could therefore become an implicit assault on community relations, leveraging a wearer's health in ways that may violate civil and constitutional rights.
It is clear that these so-called "lifeline" tools, unable to reliably identify and eradicate the forces driving self-harming behavior, are merely the tech-gilded restraints of a growing virtual asylum: one that automates treatment while ignoring disability justice advocates' calls for sustainable, patient-led mental health care. The creators of the wearable stress that by giving users the option to opt in or out of these restraints, selecting the data they want to share with caregivers and family, the device prioritizes user consent and control. Yet Pretaa and Fitbit neglect to advertise that the sensitive information users choose to share with the platform can make them the unwitting prey of harmful public health policies and trap them in an under-resourced cycle of coercive care. This is the reality of mental healthcare for thousands of Americans with neurodivergences and disabilities, particularly those experiencing intersecting forms of oppression such as homelessness. Behavioral monitoring is rarely a choice for these individuals, who have long been coercively stripped of their decision-making authority, dehumanized, and involuntarily subjected to treatment that they are often unable to access voluntarily.
In New York, Mayor Eric Adams' mandate to loosen criteria for the involuntary hospitalization of individuals whom NYPD officers deem mentally ill reiterates this legacy of flagrant human rights abuses and the criminalization of homelessness and mental illness. Non-dangerous individuals are being forcefully subjected to a barrage of paternalistic "care" based on mere speculation about their future conduct. Not only has this invasive monitoring, paired with unnecessary crisis-responder and police intervention, proved ineffective at reducing harm; it also leads to incarceration. Constant surveillance and collection of an individual's biometric and location data will sharpen the tools of this mental healthcare-to-prison pipeline.
In practice, the Pretaa Fitbit is no innovation – it is a glorified ankle monitor that may insidiously enable police surveillance, updating medicalized racism for the digital age by further intertwining healthcare with a discriminatory carceral system. We do not need more data on where substance use disorder patients are using. Rather than more stigmatization and surveillance, governments and healthcare providers should be offering more accessible, evidence-based treatment along with environmental and financial support.
Ultimately, if the Pretaa Fitbit and its ilk are the future of mental health care, then we are headed towards a future where the safety and security of our health is dangerously dependent on the forfeiture of privacy and freedom, especially for BIPOC, neurodivergent, and disabled communities.
Sarah Roth is a Development and Communications AmeriCorps Fellow at the Surveillance Technology Oversight Project (S.T.O.P.), a recent graduate of Vassar College, and a prospective J.D. candidate.