Data aren’t just numbers, and the algorithms that use them are not infallible math, impervious to bias and influence. Like all human endeavors, technology is at its core still social, argues Sarah Brayne in her new book Predict and Surveil: Data, Discretion, and the Future of Policing.
“Decisions on what data to collect, on whom, and for what purpose are all made by humans embedded in social, organizational and institutional contexts rife with preexisting priorities, preferences, imperatives and constraints,” says Brayne, an assistant professor of sociology at The University of Texas at Austin.
As police departments across the United States face increasing criticism for unequal treatment of the citizens they are charged to protect, Brayne wants us to consider an often overlooked driver of these problems — the rise of big data surveillance and predictive analytics in law enforcement.
The use of big data by U.S. police has grown significantly in recent decades, but there is surprisingly little information on how such tools are employed, their efficacy and their possible negative effects on the communities where they are most prevalent. So, Brayne embarked on five years of field work on the subject. Focusing on the Los Angeles Police Department (LAPD) because it was among the most technologically sophisticated departments in the country, she conducted interviews with police officers as well as civilian employees, shadowed data analysts and went on ride-alongs to get a better sense of how technology has (and hasn’t) changed police work.
Old Tricks, New Tech
Contrary to the rhetoric of Silicon Valley, new technology is not always a disruptive force that overthrows previous ways of thinking. Much of the big data tool box used by modern cops builds on methods established during the “reform era” of the early 20th century. In an effort to disentangle policing from the corruption and political influence that were hallmarks of this period, reformers sought to make law enforcement more objective by adopting a data-driven approach. Perhaps the most well-known innovation of the time is the pin map in which pushpins marked the locations of crimes to search for patterns and predict future criminal activity.
Today’s police force continues to use location-based systems and other forensic techniques from the reform era, but the data that feed into them have changed considerably. Our technology-enhanced daily lives generate countless data points revealing where we go, whom we interact with, what we purchase and what we search for online. And the private companies that provide these technologies are increasingly incentivized to sell data to everyone from advertisers to police departments. People with no police contact may now unwittingly find themselves in police databases.
Widening the Net
One striking example of a surveillance technology that pulls in people with no police record is the automatic license plate reader (ALPR). ALPRs are a type of “dragnet” surveillance, in that they gather data on everyone in a vicinity rather than just those under suspicion, much like a fishing net scoops up all manner of aquatic life in order to catch the ones that will be included on restaurant menus. ALPRs record each car that passes them, taking one photo of the license plate and another of the full vehicle, while also logging time, date and location coordinates. These can be used to identify people whose travel patterns make them likely crime suspects.
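The kind of cross-referencing described above can be sketched in a few lines. This is an illustrative toy, not an actual ALPR system: the `PlateRead` record, the field names and the matching rule are all hypothetical simplifications (real readers log GPS coordinates, and real queries are far more sophisticated).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from collections import Counter

@dataclass
class PlateRead:
    """One ALPR observation: a plate, when it was seen, and where."""
    plate: str
    timestamp: datetime
    location: str  # simplified; real systems log GPS coordinates

def plates_near_crimes(reads, crimes, window=timedelta(hours=1)):
    """Count how many crime scenes each plate was observed near,
    within a given time window of each incident."""
    hits = Counter()
    for plate in {r.plate for r in reads}:
        for c_time, c_loc in crimes:
            if any(r.plate == plate and r.location == c_loc
                   and abs(r.timestamp - c_time) <= window
                   for r in reads):
                hits[plate] += 1
    return hits
```

A plate that turns up near several incidents would rise to the top of such a query, regardless of whether its owner has any police record.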
This widening of the net might sound more equitable precisely because it includes everyone rather than just individuals already in the system or those likely to be profiled by police. However, once the data are gathered, human police officers still have to interpret them in a way that is accurate and free of biased assumptions, which can be a tall order. A middle-aged, high income white woman whose car was near a series of robberies around the time they occurred might be viewed quite differently than a young Black man of lesser means.
“The appearance of objectivity is where the danger lies,” says Brayne. “If we assume that the data will always speak for itself and is not filtered through these very human prisms, that’s where we can fall into this trap of techno-utopia.”
This false sense of objectivity is a recurring theme in Brayne’s research. Both in how they are created and how they are deployed, she likens the algorithms of big data policing to a Trojan horse that smuggles in hordes of bias under a facade of impartial math.
If you’re anything like me, your knowledge of police work is largely informed by TV shows and true crime podcasts, which tend to focus on idiosyncratic crimes carried out by clever sociopaths and ultimately solved by ever cleverer detectives. As a result, some of the technologies described in Brayne’s book will initially seem like a capital idea, the kind of thing that could have helped the nice ladies on Unbelievable apprehend a serial rapist sooner and with far less legwork.
The reality of policing is, of course, much more boring. Investigation is the smallest arm of law enforcement. The largest is patrol, where cops are assigned a geographical area to monitor in the hopes of preventing crimes and quickly responding to problems. Because police can’t be everywhere at once, data are used to decide which neighborhoods and also which individuals patrol officers should focus on, both of which can reinforce longstanding inequalities.
“Throughout history, minority communities have been simultaneously overpoliced and underserved,” writes Brayne.
Using big data to choose where to send officers can create a pernicious positive feedback loop. An algorithm that draws on historical data of where the largest numbers of arrests occurred will recommend increasing patrol in those locations. This will lead to greater scrutiny of what are often minority neighborhoods and thus more arrests within those communities. Meanwhile suspicious activity in historically less patrolled areas is likely to go unnoticed. Algorithms purporting to pinpoint areas with the highest crime rates may actually just be revealing those with the most police presence.
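The feedback loop can be made concrete with a toy simulation. Everything here is an illustrative assumption, not a real dispatch algorithm: two areas with identical true crime rates, a single patrol unit, and a rule that always sends it wherever the most arrests have been recorded.

```python
import random

def simulate_patrol_feedback(true_crime_rates, steps=50, seed=0):
    """Toy model of arrest-driven patrol allocation. Arrests can only
    be recorded in the area that is patrolled, and patrol always goes
    to the area with the most recorded arrests."""
    rng = random.Random(seed)
    arrests = [1] * len(true_crime_rates)  # equal starting history
    for _ in range(steps):
        # dispatch the single patrol to the area with the most past arrests
        area = arrests.index(max(arrests))
        # an arrest occurs with probability equal to that area's true crime rate
        if rng.random() < true_crime_rates[area]:
            arrests[area] += 1
    return arrests
```

Even with identical underlying crime rates, whichever area gets patrolled first accumulates all the recorded arrests, and the data then "confirm" it is the high-crime area.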
Even more troubling than the kind of place-based predictive policing described above is the use of person-based technology. In her book, Brayne details the LAPD’s adoption of so-called “Chronic Offender Bulletins” as part of its Operation LASER (Los Angeles’ Strategic Extraction and Restoration Program) predictive policing initiative. Chronic offenders are designated using a point system that tracks factors believed to be high risk for future criminal behavior (five points for known gang affiliation, five points for prior arrests with a handgun, and so on). But individuals are also assigned one point for every “police contact.” This means that every time a police officer stops someone and fills out a field interview (FI) card detailing the encounter, an additional point is added to their record. Every day, LAPD databases generate lists of the twelve “worst” chronic offenders in each division, who are then more likely to be approached by police, resulting in the accumulation of more points, and a greater likelihood of being included in a future bulletin.
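The scoring logic Brayne describes can be sketched in a few lines. This is a toy model: the two five-point factors come from the book's description, but the field names and overall structure are hypothetical simplifications, not the LAPD's actual formula.

```python
def chronic_offender_points(profile):
    """Toy LASER-style point score. The five-point weights follow the
    book's description; the profile keys are illustrative assumptions."""
    points = 0
    if profile.get("gang_affiliation"):
        points += 5  # known gang affiliation
    if profile.get("handgun_arrest"):
        points += 5  # prior arrest involving a handgun
    # one point per recorded police contact (each field interview card)
    points += profile.get("police_contacts", 0)
    return points
```

The last line is the feedback mechanism: every stop adds a contact, which raises the score, which makes the person more likely to appear on a bulletin and be stopped again.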
It’s easy to see how such technologies could discourage or thwart people who are trying to get on with their lives and stay out of trouble.
“The criminal justice system is kind of like the Hotel California; you can check out any time you like but you can never leave,” says Brayne. “Even if you are no longer on parole or probation or you’ve served your sentence. We’re not all equal under the law.”
Does Data-driven Policing Work?
Toward the end of Predict and Surveil we learn that a 2019 report by the Office of the Inspector General (OIG) analyzing the costs and benefits of Operation LASER and PredPol (a predictive analytics platform used by LAPD and other U.S. police departments) failed to find strong evidence that the programs had reduced crime. The OIG report also noted significant concerns about inconsistent enforcement and a need for greater transparency and oversight. LAPD has since discontinued Operation LASER and pivoted from predictive policing to something called “precision policing,” which, while perhaps less immediately evoking images from Minority Report, may be little more than an attempt to rebrand wildly unpopular practices.
Amidst all the concerns of bias and civil liberties violations surrounding various policing practices, it’s easy to forget to ask questions about efficacy. This is especially true of big data-driven policing, whose veneer of objectivity implies that some kind of scientific proof underlies its use. Yet answering the seemingly simple question of “does it work?” is not so easy.
“What still baffles me is the lack of data on data-driven policing,” says Brayne. “Despite all of the rhetoric around data-driven policing, when a police department decides to adopt a new tool or platform, at least at the time when I started the fieldwork, there is usually zero evidence of its efficacy.”
Part of the problem, she explains, is that the role of private sector industries in creating the technology makes it difficult for civilians to access information needed to make such an assessment. Independent research can’t be done when only police departments and software developers have access to the data inputs and outputs of these systems.
But there are also normative questions involved. How should the success or failure of big data policing techniques be measured? To police departments, rearrests of chronic offenders might be seen as proof that the tools are working, whereas to community members the same metrics look like harassment with no improvement to public safety. And even if there were unanimous agreement that more arrests equal success, the question of how much cost we are willing to incur to achieve this would still be up for debate. If police stop someone on a Chronic Offender Bulletin dozens of times before they catch them doing anything illegal, is this really an improvement in efficiency?
The Future of Policing
Data and surveillance technology used by police, Brayne argues, needs to be part of the discussion on how to reform, restructure or dismantle existing law enforcement agencies. But she also notes that technology is neither the cause of nor the solution to all of society’s problems. Technology may reinforce or exacerbate inequalities, but it does not create them.
Nor is widespread surveillance the inevitable result of technological advancement. Brayne points out that groups with political, social or economic power are able to evade surveillance in a number of ways. For example, due to National Rifle Association lobbying, the U.S. has no federal gun registry. And police themselves, despite being surrounded by new technologies, have fought to limit those they feel infringe on their own privacy. Brayne recalls being surprised on her first ride-along to see an officer manually entering his location on a laptop. Surely, the technologically advanced LAPD had a way of tracking its own cars? Each one was in fact equipped with an automatic vehicle locator, she learned, all of them turned off due to resistance from the police union.
And while some police departments have used technology to identify officers likely to engage in the use of excessive force and other behaviors that could result in legal and financial repercussions for their departments, enlisting big data to police the police is at best a partial solution to the larger problem of over-policing and institutional bias.
“We need to be really wary of the idea of ‘techno-benevolence’ and technological fixes to social problems,” Brayne cautions.
The use of big data in law enforcement, as well as other parts of the criminal justice system, has the potential to reduce bias and make these systems more equitable, says Brayne, but in order to do so we need to acknowledge the unavoidably social nature of data and thoroughly scrutinize technological platforms to ensure that they don’t replicate and amplify the very biases they claim to eliminate. And this, of course, requires greater transparency.
“If the ownership of and access to the data continues to lie only in the hands of the police,” says Brayne, “then I don’t think any meaningful change will occur.”