Case study of LAPD and Palantir’s predictive policing tool: same corruption; new, empirical respectability

UT Austin sociologist Sarah Brayne spent 2.5 years conducting field research with the LAPD as it rolled out PredPol, a software tool that is supposed to direct police to the places where crime is likely to occur, but which has been shown to send cops out to overpolice brown and poor people at the expense of actual crime-fighting.

Brayne observed and interviewed more than 75 cops to build a picture of how big-data “predictive” tools are changing the job of policing. She found that the tools were turning the police from a law-enforcement agency into an intelligence agency, one more concerned with surveilling people who have not committed any crime than with interdicting or solving actual crimes.

The cops she interviewed were bullish on Palantir’s products, though they also candidly admitted that predictive tools allowed them to put an objective face on their existing, illegal racial-profiling practices (“[Y]ou can’t target individuals especially for any race… [W]e didn’t want to make it look like we’re creating a gang depository of just gang affiliates or gang associates… We were just trying to cover and make sure everything is right on the front end”).

Predictive policing casts a very wide net. Where once the police would only assemble a file on you if you were suspected of a crime, the Palantirization of policing means that “police increasingly utilize data on individuals who have not had any police contact at all.” Automatic license-plate readers (ALPRs) log the movements of everyone in a city; then, if a predictive-policing algorithm fingers you as somehow connected to a suspect, all of your movements, going far back in time, are summoned up and delivered to the police. The same goes for other automated bulk-collection records, like cellphone surveillance through IMSI catchers and bulk records requests to phone companies.
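To make the mechanics concrete, here is a minimal sketch of that retroactive lookup in Python. Everything in it (the schema, the function name, the sample plate reads) is invented for illustration and has nothing to do with Palantir’s actual systems; it only demonstrates the pattern Brayne describes: collect on everyone first, attach the suspicion later.

```python
from datetime import datetime

# Hypothetical bulk ALPR log: every plate read is retained, whether or not
# the driver is suspected of anything. Each record: (plate, timestamp, location).
alpr_log = [
    ("7ABC123", datetime(2015, 3, 1, 8, 15), "Western & 54th"),
    ("4XYZ987", datetime(2015, 3, 1, 8, 16), "Western & 54th"),
    ("7ABC123", datetime(2015, 6, 9, 22, 40), "Crenshaw & Slauson"),
    # ...millions more reads, all collected before any suspicion exists
]

def movement_history(plate, since):
    """Retroactively reconstruct someone's movements from the bulk log.

    Nothing new is collected at query time: the surveillance already
    happened. The algorithmic "hit" merely unlocks it.
    """
    return [(ts, loc) for p, ts, loc in alpr_log if p == plate and ts >= since]

# A predictive model flags a plate as "connected to" a suspect...
flagged_plate = "7ABC123"

# ...and years of pre-suspicion movement data materialize on demand.
for ts, loc in movement_history(flagged_plate, since=datetime(2014, 1, 1)):
    print(ts, loc)
```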

In Brayne’s words, it’s no longer just that individuals engage in incriminating acts; now, “individuals lead incriminating lives—daily activities, now codified as data, can be marshaled as evidence ex post facto.”

What’s more, these tools are ready-made for “parallel construction…the process of building a separate evidentiary base for a criminal investigation to conceal how the investigation began, if it involved warrantless surveillance or other inadmissible evidence.” This means that any protections embedded in warrantless surveillance regimes (like the inadmissibility of the evidence they produce) are easily circumvented by law enforcement.
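As a toy data-flow illustration of that laundering (invented, of course; real parallel construction happens in police paperwork, not code), the “protection” reduces to a provenance field that the final record simply drops:

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    fact: str          # what investigators learned
    source: str        # how they actually learned it
    admissible: bool   # whether that source would survive in court

# The investigation's real history: the first lead comes from warrantless
# surveillance, which would taint the case if its role were disclosed.
case_file = [
    Evidence("suspect's car was near the scene", "warrantless ALPR sweep", False),
    Evidence("suspect's car was near the scene", "'routine' traffic stop", True),
    Evidence("suspect knew the victim", "subpoenaed phone records", True),
]

def parallel_construction(case_file):
    """Rebuild the evidentiary record from admissible sources only,
    concealing how the investigation actually began."""
    return [e for e in case_file if e.admissible]

for e in parallel_construction(case_file):
    print(f"{e.fact} (per {e.source})")
```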

Brayne paints a picture of law enforcement and vendors like Palantir working together to keep business as usual in place, but with a veneer of empiricism. A cop who “knows” that someone is guilty can cast ever-wider surveillance nets until he finds confirming evidence, then rebuild his case using sources that are admissible in court, railroading his chosen perp into prison with the appearance of mathematical objectivity rather than the overt racial bias that put the LAPD under a Department of Justice consent decree.

As Brayne says, “Characterizing predictive models as ‘just math,’ and fetishizing computation as an objective process, obscures the social side of algorithmic decision-making. Individuals’ interpretation of data occurs in preexisting institutional, legal, and social settings, and it is through that interpretive process that power dynamics come into play.”

Big Data Surveillance: The Case of Policing [Sarah Brayne/American Sociological Review] (Sci-Hub mirror)