How policing’s new digital tools paste a data-based veneer over old systemic problems of discrimination

The justification for the explosion of ‘predictive policing’ data tools is that they eliminate bias, render decision-making objective, and end practices such as racial profiling. One University of Texas sociologist found just the opposite: the surveillance these tools enable reinforces stereotypes and bias

Use of ‘predictive policing’ in Los Angeles is discussed as a model for Texas on Dallas TV

If America is to learn how to police better, it actually must learn how to police less.

So argues author and University of Texas at Austin sociologist Sarah Brayne, who spent months embedded with the Los Angeles Police Department to examine the use of sweeping new data tools — which she concluded may be creating more problems than they solve.

Author and sociologist Sarah Brayne

“We are focusing on how we can reallocate police resources to police better,” Brayne argued in a Feb. 10 podcast interview with Urbanitus co-founder Jeremi Suri and poet Zachary Suri. “Let’s think more about how we can reallocate resources to police less.”

Brayne is an assistant professor of sociology at UT’s College of Liberal Arts. Historian Jeremi Suri is a professor in the university’s Department of History and the LBJ School of Public Affairs. They were joined for their discussion in the weekly podcast, This is Democracy, by poet Zachary Suri, an Austin high school student.

Brayne’s book, Predict and Surveil: Data, Discretion, and the Future of Policing, was published in November and has quickly become a widely used reference in the debate that has swept America since the May killing of George Floyd by police ignited protests across the country. In Austin, where Floyd’s death was preceded just weeks earlier by the fatal shooting of unarmed Michael Ramos, the ensuing protests prompted the City Council to cancel one class of the city’s police academy in order to review training and to reorganize some department functions. And while other measures are being mulled in Austin, the future of the city’s police force has become a white-hot topic in recent local elections.

Now, at the urging of Texas Gov. Greg Abbott, the Texas Legislature is considering police-focused legislation that could even place the Austin Police Department under state management.

But it’s a debate, Brayne suggested in her wide-ranging discussion, that may well miss the larger context and reality of America’s deeply troubled policing at a moment when technology is driving police into a feedback loop of growing confrontation with some communities. Those growing tensions, she said, are driven by data-based, “quantified, self-fulfilling prophecies.”

“All of these digital trails we leave in our lives, these have really exploded in the last 10 to 15 years. So where is this leading us?” asked Jeremi Suri.

Brayne’s answer was that the enthusiastic embrace by police departments of technology tools enabling so-called “predictive policing” wrongly assumes that data is somehow objective and bias-free.

“Data does not necessarily speak for itself,” she said. “It’s not just a mechanical reflection, or a mirror reflection of what’s going on.” Data, in fact, is social and carries all the complexity, bias and prejudice of those who create it.

Her new book has been lauded for revealing the collaboration between police departments, private data brokers, and technology companies in ways similar to the role big data actors increasingly play in marketing, health care, finance and other sectors. The police departments leading the nation’s drive toward data analytics are in New York, Chicago and Los Angeles, which is why, Brayne explained, she chose the LAPD for her research.

The common justification for the explosion in use of new data tools is that they will eliminate bias, render decision-making objective, and end such practices as racial profiling.

What Brayne found, however, was the opposite. The use of big data and algorithmic decision tools, in fact, locked in biased decision-making and wrapped it in a veneer of objectivity that results in massive surveillance. That surveillance, in turn, reinforces stereotypes and discrimination, she argued.

Citing a passage from her book, Brayne described how a data analyst could narrow a search of 140 million records down to just 13 possible suspects. But when asked what would happen if the system produced a false positive, the analyst simply replied, “I don’t know.”

She described how the use of data tools drives police to build “predictive risk profiles” of suspects, using criteria that add “points” with each police interaction, so that an individual’s predictive risk score grows as if on autopilot. Essentially, it is a process of “tech washing” bad practices, sometimes unintentionally, that makes them appear benign.
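The self-fulfilling loop Brayne describes can be sketched in a few lines of code. This is a purely illustrative toy model, not the LAPD’s actual scoring system: the point value, the stop-probability rule, and all names here are invented for the sketch. The key property it demonstrates is that each stop raises the score, and a higher score makes the next stop more likely.

```python
import random

POINTS_PER_STOP = 5  # hypothetical: every police contact adds points, regardless of outcome


def stop_probability(score, base=0.05):
    """Higher risk scores draw more police attention -- the feedback loop."""
    return min(1.0, base + score / 100)


def simulate(days=365, seed=42):
    """Simulate one person's score over a year of possible police contacts."""
    random.seed(seed)
    score = 0
    stops = 0
    for _ in range(days):
        if random.random() < stop_probability(score):
            stops += 1
            score += POINTS_PER_STOP  # the stop itself raises future risk
    return score, stops


score, stops = simulate()
print(f"final score={score}, stops={stops}")
```

In this toy model the score can only ratchet upward: no interaction ever lowers it, so early contacts (however arbitrary) compound into a high “objective” risk number, which is the dynamic Brayne calls a quantified, self-fulfilling prophecy.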

“If we don’t consider data as social, if we do consider it as sort of objective, unbiased, the ground truth, then what happens is that all of the social processes that shape data collection become obscured,” Brayne said. “It means that the patterns of the ways police make decisions become invisible.”

Which is not to say that Brayne advocates abandoning technology in the quest for a more just society. Rather, she argues that data tools should be directed toward improving education, social welfare, and health care, and toward strengthening the institutions that can confront the conditions that create criminality in the first place.

 “Police reform,” Brayne said, “ironically, is going to have a lot to do with what we do outside of the police context.”

The podcast discussion also includes Zachary Suri’s reading of his poem “Does the Algorithm Understand Poetry?”

Click here to listen to the entire podcast, “Big Data and Policing,” the 135th episode of This is Democracy, produced by the UT Liberal Arts Development Studio at the College of Liberal Arts.
