Surveillance technologies create a dragnet to monitor nearly everyone and collect data on all of our behaviors.

By Ed Vogel & Fletcher Nickerson, Truthout
Published May 21, 2023
In late October 2022, a police officer in Wichita, Kansas, was arrested for stalking his estranged wife. The officer had been using the police department’s automated license plate reader (ALPR) technology to monitor her movements, without permission or authority to use the technology. The officer was removed from the force and charged with both stalking and the unlawful acts related to his use of the ALPRs.
In response, the vendor for the ALPRs, Flock Safety, visited Wichita to address the situation. Company representatives stated that this was the first occurrence of abuse that was reported to the company since its founding in 2017, and that “Flock is working to develop an alert system that flags suspicious requests but has not done so yet.” The officer’s use of the ALPR system had only been discovered after his estranged wife informed a friend, who then told the police that the officer was stalking her, as neither the department nor the company maintained any clear safeguards to prevent the technology from being used in this manner.
In many ways, the incident in Wichita is frighteningly typical; research shows that police may commit acts of domestic violence at nearly 15 times the rate of non-police. What is new and unusual is not the officer’s act of stalking, but how he did it: by tracking his wife’s car through the use of ALPRs.
License plate readers are pitched as a way to identify “criminals” who might be driving around undetected, especially amid a supposed (often unsubstantiated) rise in car-related crimes. However, as police across the United States increasingly deploy ALPRs to scan license plates and track the movement of thousands of cars every day, we must question their use as a “safety” measure. In fact, we must recognize the broader expansion of surveillance technologies for what it is: a force that makes everyone less safe by upholding white supremacy and other systems of oppression.
Law enforcement agencies, including Immigration and Customs Enforcement, use ALPRs to identify cars on a “hot list” — a list of license plate numbers deemed to be tied to cars and people involved in criminal acts — which enables police to move faster and more deliberately to arrest the drivers of those vehicles. The adoption of ALPRs has seeded a huge private industry in which companies like Motorola Solutions, ELSAG and Flock Safety earn millions by deploying these cameras and collecting billions of data points annually.
ALPRs are just the tip of the iceberg as surveillance cameras, facial recognition technologies and other products increasingly monitor people 24/7. The officer in Wichita using the surveillance technology at his disposal toward a violent end is not an anomaly. In numerous other incidents, police have used databases to illegally identify people, abused facial recognition technology when trying to identify a suspect, and requested that a surveillance tech provider alter data, leading to someone being charged with murder.
Plain and simple, surveillance technologies are not designed to create safer communities. Creating safer communities would require increased public funding and resources to address the root causes of poverty and other reasons people commit acts of violence. Instead, surveillance technologies create a dragnet to monitor nearly everyone and collect data on all of our behaviors.
However, while everyone is being tracked, not everyone is being pursued and criminalized; police choose to use surveillance technologies like ALPRs against specific, marginalized sets of people. Police make the subjective decision as to whom to monitor and how. They then use these datasets to justify the ongoing criminalization of Black, Brown, queer, trans, immigrant, disabled and poor communities, and use the crime data they produce to call for still more surveillance. The cycle repeats itself. Surveillance technologies intensify and exacerbate the criminalization of already heavily criminalized people.
In reality, the use of surveillance technologies actually makes communities less safe. Instead of interrupting violence, these technologies invite it. First, surveillance tools increase the presence of police in communities that have been decimated by mass incarceration. Second, surveillance technologies increase the potential and threat of police contact, which always contains the possibility of physical violence. Finally, by enabling more interaction between residents and police, these technologies open the door for more residents to be subjected to abusive practices like stop and frisk, arrest, jailing and potentially prison time, simply because the technology initiated a police deployment, even if no violence was committed.
All of this must also be considered against the fact that there is no conclusive evidence that any of these technologies have a clear impact on reducing harm or violence. In criminalized communities, surveillance technologies create the conditions for more violence by facilitating the purpose of policing — punishment, not safety.
The ubiquity of digital surveillance technology is the latest example of how the state maintains a monopoly on defining what is the legitimate use of force and violence through law, policy and practice. Elected officials, judges and police create, define and enforce the legal boundaries which mark when force and violence can be used in acceptable ways under the law. With little public oversight or awareness, surveillance technologies have quickly been ushered into the realm of what are considered acceptable tools for use by police, schools, and other public agencies.
In the abstract, proponents only speak to how these tools will be used to monitor people performing behaviors and actions which are currently deemed to be criminal. The rise of computing technologies birthed new methods of surveillance which police and other entities have implemented with little understanding of the full scope of the potential for invasive violence each tool has. Elected officials and law enforcement claim that these surveillance technologies are urgently necessary because they will help to curb violence and promote community safety. But we must ask, whose lives are made safer by these tools?
The news from Wichita illustrates another layer of the state’s calculation regarding the legitimate use of force: Elected officials who approve surveillance technology contracts value order more than safety. Surveillance technologies inherently lack controls, and preventive oversight is nearly impossible. The only accountability measure within these surveillance tools is an audit log that records police use only after the fact. Police can employ these tools for their own designs and then, if caught, ask for forgiveness later. This only becomes an issue for officers if they overstep the line on what the state defines as acceptable and are caught doing so, as in the Wichita case. These tools are violent whether they are used properly or improperly.
The lack of controls regarding the use of surveillance raises a question: When there is substantial evidence that police commit acts of domestic violence at higher rates than non-police, why is the state willfully giving them technologies which have the potential to enable more harm? On the surface, it does not appear that people experiencing domestic violence, for example, are included in the consideration of whose lives are worthy of being made safer.
Further, the scope for abuse and violence extends beyond the elements of the Wichita case. If a police officer estranged from his wife can use ALPR data to monitor her, what about the avowed white supremacist police officer? What might a D.C. Metropolitan Police Department officer, who is a member or sympathizer of the Proud Boys, do with access to facial recognition technology after he snaps a few photos at a Pride march or Black Lives Matter protest? What if the supervisor of that officer is also a member of the Proud Boys and simply overlooks the use of this powerful technology to obtain a protester’s identity and their home address?
At this point, this scenario may only be hypothetical, but there are plenty of avowed white supremacists who work in law enforcement, and the January 6 attempted coup by white supremacists should confirm that this is a very real threat: an officer from the D.C. police actively shared intel with the Proud Boys in the lead-up to the attempted coup. Whether performed or facilitated by police, the dogmatic and widespread adoption of invasive surveillance technologies could provide enough intelligence to enable a massive white supremacist attack, like the Greensboro massacre of 1979, or a massive coverup of patterns of racist violence, like Jon Burge’s torturous regime as a Chicago Police Department commander.
What the state defines as a legitimate use of force is not stable and can shift. The most glaring example of how the standards of legitimate surveillance can quickly shift is the recent restrictions on access to abortion. Before the Supreme Court’s 2022 Dobbs decision, seeking abortion services was a constitutionally protected act. Now, in many states, it’s considered criminal behavior. ALPRs, like those used by the officer in Wichita, might well be used by officers to track people crossing state lines in search of abortion services. The Dobbs decision created the conditions for states to criminalize a broader set of people. In doing so, the mass surveillance tools already deployed became more potentially violent to a new set of people overnight. The anti-transgender legislation push currently sweeping the country is another example.
In addition to police and governments, the other major players in the consideration of surveillance technologies are the companies peddling these products. Surveillance is profitable. Flock Safety alone has raised at least $380 million in venture capital funding since its founding in 2017 and has been valued at $3.5 billion. The profitability of the surveillance tools developed and sold by companies like Flock Safety, Motorola Solutions, Fog Data Science, Clearview AI and SoundThinking is an extended layer of the prison-industrial complex. The state has not only defined what is legal and acceptable violence but also made that violence necessary for economic growth. And since capitalism and the prison-industrial complex have always been racialized, the violence of surveillance capitalism most heavily targets people of color, profiting off of harm inflicted on Black and Brown communities.
While the line on what is acceptable violence can change, what hasn’t changed is the imperative of the state to protect corporate profits, white supremacy, and other systems of oppression. This should provide a reminder that when the state promises safety, we need to ask: “Whose safety?”