Predictive Policing Software Shown to Entrench Bias, not Address It

Gizmodo’s analysis is the first independent examination of PredPol, the earliest and most widely used predictive policing software. At its core, PredPol, like other predictive policing tools, works by analyzing past crime and patrol data to predict future crime patterns, giving agencies an idea of where to direct their resources. After coming across an unsecured cloud storage space that housed PredPol documents and data, the publication was able to capture 7.8 million individual predictions for 70 different jurisdictions across the United States. Gizmodo narrowed its focus to 38 city and county agencies with at least six months’ worth of data, confirmed those agencies did in fact use PredPol, and then broke down the block groups identified within those agencies’ jurisdictions by demographics using the 2018 American Community Survey.
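For readers curious what that kind of analysis involves, the sketch below shows one plausible way to count predictions per census block group and compare demographics of the most- and least-targeted blocks with pandas. The file names and column names are invented for illustration; they are not Gizmodo’s actual data or schema.

```python
# A minimal sketch of the analysis described above, under assumed file/column names.
import pandas as pd

predictions = pd.read_csv("predpol_predictions.csv")      # hypothetical: one row per prediction
demographics = pd.read_csv("acs_2018_block_groups.csv")   # hypothetical: ACS shares per block group

# Count how often each block group was targeted
counts = (predictions
          .groupby(["jurisdiction", "block_group"])
          .size()
          .rename("n_predictions")
          .reset_index())

# Attach demographic shares to each block group
merged = counts.merge(demographics, on=["jurisdiction", "block_group"], how="left")

# Compare the most-targeted block groups with those rarely targeted
for name, group in merged.groupby("jurisdiction"):
    top = group.nlargest(10, "n_predictions")
    bottom = group.nsmallest(10, "n_predictions")
    print(name,
          "most-targeted Black/Latino share:",
          top[["pct_black", "pct_latino"]].mean().round(3).to_dict(),
          "least-targeted:",
          bottom[["pct_black", "pct_latino"]].mean().round(3).to_dict())
```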

The full study is several pages long, but it confirms what many of us have long suspected: predictive policing software targets lower-income communities and communities of color while largely leaving higher-income and white communities alone.

When it came to race and ethnicity, Gizmodo’s research found that “the most-targeted block groups had a higher Black or Latino population while block groups that were never or infrequently targeted tended to have a higher White population.” Asian residents weren’t targeted as heavily, but they were still targeted more than white residents, with the Asian population in the most-targeted block groups exceeding the jurisdiction’s median Asian population in about a third of the blocks studied.

“The data suggests that as the number of predictions in a block group increases, the Black and Latino proportion of the population increases and the White and Asian proportion of the population decreases,” the study reads.

In a majority of the jurisdictions in Gizmodo’s dataset, the blocks most heavily targeted also contained a higher proportion of lower-income households ($45,000 per year or less—the publication deemed the federal poverty line of $26,600 far too low to accurately capture the number of households struggling financially).

How does this happen? PredPol’s algorithm doesn’t directly incorporate race data, but it does base its output on past crime reports and patrol patterns. If racial bias has been a factor in how those patterns formed, it will inevitably be baked into the predictions PredPol makes.
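The dynamic is easy to reproduce in a toy simulation. The sketch below is not PredPol’s actual model, just a minimal stand-in: it allocates patrols in proportion to previously recorded crime, and because crime is only recorded where officers are present, an initial imbalance in patrols persists even though the two neighborhoods have identical underlying crime rates.

```python
# Toy feedback-loop simulation (illustrative only; not PredPol's algorithm).
import random

random.seed(0)
TRUE_CRIME_RATE = {"A": 0.05, "B": 0.05}   # both neighborhoods are identical
patrols = {"A": 70, "B": 30}               # historical bias: A is over-patrolled
recorded = {"A": 0, "B": 0}                # cumulative recorded incidents

for day in range(365):
    # Incidents are only recorded where officers are present to observe them
    for hood, n_patrols in patrols.items():
        recorded[hood] += sum(random.random() < TRUE_CRIME_RATE[hood]
                              for _ in range(n_patrols))
    # "Prediction": allocate the next day's 100 patrols in proportion to
    # crime recorded so far -- race never enters the calculation
    total = recorded["A"] + recorded["B"]
    if total:
        patrols = {hood: round(100 * recorded[hood] / total) for hood in recorded}

print(recorded, patrols)
# Neighborhood A ends up with the bulk of recorded crime and patrols,
# even though the underlying crime rates were equal from the start.
```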

And race has been a factor in American police practices from the beginning. Communities of color have been subject to disproportionate levels of monitoring and control over their movement, both historically and in the present day, even when crime rates are not actually higher in the areas where they live. PredPol denies its software works this way, but the proof is in its own results.

One study published in 2018 (led by one of PredPol’s own founders) found that different implementations of the algorithm led to very different outcomes. When applied to Indianapolis data from 2012 to 2013, the default predictive algorithm would have resulted in Latino populations receiving “200-400 percent the amount of patrol as white populations.”

The study found that the algorithm could be tweaked to distribute patrols more fairly across the city, but that doing so had some impact on accuracy, that is, on how reliably patrols were sent to areas where crimes were actually committed. Overall accuracy under the tweaked “fairness” model was still claimed to be higher than that of human-directed patrols, which may say something about humans’ own bias in policing. (Sure enough, so-called “broken-windows” policing, which targets areas with visible signs of crime and therefore tends to target lower-income areas, is a human-made practice that has been criticized for its baked-in bias for years.)
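One way to picture that trade-off is the constrained patrol allocation sketched below. The scores, group labels, and cap are invented for illustration and are not the study’s actual method; the point is simply that capping how concentrated patrols can get in any one group’s neighborhoods reduces, somewhat, how much predicted crime the patrols cover.

```python
# Hypothetical illustration of a fairness/accuracy trade-off in patrol allocation.
import random

random.seed(1)

# Each block gets a predicted crime score; assume biased historical data
# inflates the scores of blocks in group-A neighborhoods.
blocks = []
for _ in range(200):
    group = random.choice("AB")
    score = random.random() + (0.3 if group == "A" else 0.0)
    blocks.append({"group": group, "score": score})

PATROL_BUDGET = 40

def coverage(chosen):
    # Proxy for "accuracy": share of total predicted crime covered by patrols
    return sum(b["score"] for b in chosen) / sum(b["score"] for b in blocks)

# Unconstrained: simply patrol the highest-scoring blocks
by_score = sorted(blocks, key=lambda b: -b["score"])
unconstrained = by_score[:PATROL_BUDGET]

# "Fairness" tweak: cap how many patrols either group's blocks can absorb
cap = PATROL_BUDGET // 2
constrained, counts = [], {"A": 0, "B": 0}
for b in by_score:
    if len(constrained) == PATROL_BUDGET:
        break
    if counts[b["group"]] < cap:
        constrained.append(b)
        counts[b["group"]] += 1

print("unconstrained coverage:", round(coverage(unconstrained), 3))
print("capped coverage:       ", round(coverage(constrained), 3))
```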

That’s the problem, though: if an algorithm is going to make predictions based on prior data, that data should be free of bias to begin with. This is extremely difficult to do with any dataset but especially difficult in the realm of law enforcement, given everything we’ve learned about preconceptions in policing over the last year or so. Simply put, dirty data equals dirty software.

After reviewing Gizmodo’s analysis, PredPol said it was based on “erroneous” and “incomplete” data, despite confirming that the reports the analysis was based on were in fact generated by PredPol. Gizmodo says the company did not justify its claims.

Some law enforcement agencies have refused to use, or have stopped using, predictive policing software because they found it useless or redundant (which can be read as a good sign or a bad one, depending on how you look at it). Others told Gizmodo that they use the software as a starting point, not the final word. But one thing is for sure: if law enforcement intends to address its long history of racial and economic bias, PredPol might not be the best place to start.

NASA’s Mars Helicopter Remains Grounded Awaiting Software Fix

NASA previously said the Ingenuity helicopter would take to the Martian skies over the weekend, but the agency announced late Friday that liftoff was delayed until at least April 14 because of a software issue.

Software Bug Delays Ingenuity Helicopter’s 4th Mars Flight

This appears to be the same issue that delayed Ingenuity's first flight. NASA says it's planning to try this one again today, and we should know in a few hours whether or not it was successful.

OnePlus Confirms No OnePlus 9T in 2021, New Android Software Coming

OnePlus has announced that it's not doing a T-series update to the OnePlus 9. The next time OnePlus releases a phone, it'll be running a new version of Android that replaces the current Oxygen OS.

VW Software Conveniently Helps Drivers Cheat Emissions Tests, Again

The software tricks emissions tests into thinking the vehicle is coughing out as little as one-fifteenth the amount of nitrogen oxide it really is.