MIT Technology Review

Chicago’s Experiment in Predictive Policing Isn’t Working

A new report suggests that a data-driven tool meant to reduce gun violence was ignored by police and, in a few cases, may have been misused.

Can technology be used to predict—and prevent—crime? In the case of Chicago’s recent attempt to prevent gun violence, the answer seems to be no.

A new report put together by Jessica Saunders and colleagues at the RAND Corporation examines how the Chicago Police Department implemented a predictive policing pilot project in 2013 and 2014. The city used a computer model to examine data on people with arrest records and come up with a list of a few hundred individuals deemed at elevated risk of being shot (or committing a shooting—the two groups have a striking amount of overlap). The idea was that police would be able to use the list to reach out to people and try to help them out of high-risk situations.

But the report, published in the Journal of Experimental Criminology, suggests two big problems with the program.

First, the researchers found that in over two-thirds of cases, police throughout the city simply ignored the list (the formal name of which is the Strategic Subjects List, or SSL). They write:

Overall, the observations and interview respondents indicate there was no practical direction about what to do with individuals on the SSL, little executive or administrative attention paid to the pilot, and little to no follow-up with district commanders.

This suggests that the department’s rank and file were left perplexed by the data in front of them—probably because, as The Verge points out, no fewer than 11 other violence-reduction programs were in play at the time. As a result, officers went about the business of everyday police work. And when no one from upper management checked whether anyone was using the system’s recommendations, they fell by the wayside.

Second, when police did attempt to act on the list, the results weren’t very inspiring. The sample size was small—officers used the list to make just nine arrests. But the researchers found that people on the list were nearly three times as likely to be arrested for a shooting as those who weren’t flagged by the system:

The finding that the list had a direct effect on arrest, rather than victimization, raises privacy and civil rights considerations that must be carefully considered, especially for predictions that are targeted at vulnerable groups at high risk of victimization.

The picture this study paints is not of a technology that is about to revolutionize crime-fighting or turn the tide of gun violence in Chicago. It’s of a highly imperfect initiative that, when first implemented, confused officers more than it helped them.

Gun violence is a huge problem in Chicago, and it’s understandable that city officials would want to bring technological tools to bear on the problem. But despite our obsession with data, it isn’t a solution. It is, at best, just another tool—one that needs to be handled with extreme care, especially when lives are at stake.

Read more: (BoingBoing, The Verge, New York Times, “Data-Toting Cops,” “The Problem with Our Data Obsession”)
