A Big Report Proving The Justice System Is ‘Racist’ Is All Nonsense

A major article widely cited as proof of racial bias in the criminal justice system is actually anything but.

According to ProPublica’s May article “Machine Bias,” a popular risk assessment software program, used across the country to evaluate criminal defendants’ risk of reoffending, produces outcomes heavily biased against black defendants compared to white ones.

The article’s findings have started to take off. Immediately after publication, it was cited on sites like The Daily Dot and Fusion. On June 25, The New York Times cited it as proof that both computer algorithms and law enforcement are substantially biased against blacks. Last week, it was touted by FiveThirtyEight’s science reporter.

But ProPublica’s conclusions are deeply flawed, representing a major error in statistical reasoning on the part of its staff.



1. How ProPublica could find bias

ProPublica’s investigation focused on the COMPAS tool from Northpointe, Inc., which predicts a criminal defendant’s likelihood of reoffending after release based on inputs for over 100 different variables (race is explicitly excluded). COMPAS rates defendants on a 1-10 scale, with 1 being the lowest risk and 10 being the highest. COMPAS is used nationwide in pretrial, sentencing, and parole hearings to assist in a variety of judicial decisions.

If COMPAS genuinely had a racial bias, it wouldn’t be too difficult to find evidence. Functionally, it’s just a risk assessment tool, taking the details of criminal defendants and producing a rating from 1-10 reflecting their chances of reoffending. In a phone call, Northpointe co-founder and chief scientist Tim Brennan said a risk rating of 1 represents a 1-2 percent chance of reoffending, while a rating of 10 represents a risk approaching 80 percent.
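To make that scale concrete, here is a minimal sketch of the score-to-risk mapping Brennan described. The linear interpolation between his two endpoints is purely an assumption for illustration; the actual COMPAS mapping is proprietary and has not been published.

```python
def approx_reoffense_risk(score: int) -> float:
    """Illustrative only: linearly interpolate between the two anchor
    points Brennan cited (score 1 ~ 1.5%, score 10 ~ 80%). The real
    COMPAS score-to-risk mapping is proprietary and may be nonlinear."""
    if not 1 <= score <= 10:
        raise ValueError("COMPAS scores run from 1 to 10")
    low, high = 0.015, 0.80  # assumed anchors from Brennan's figures
    return low + (high - low) * (score - 1) / 9

for s in (1, 4, 7, 10):
    print(f"score {s}: ~{approx_reoffense_risk(s):.0%} estimated risk")
```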

Finding bias in COMPAS, then, is straightforward. For COMPAS to be a valid and fair test, it must do two things. First, it must predict recidivism with a useful level of accuracy, and second, it must handle different races equally in assessing their risk level. If the program were unfair toward members of a particular race, then members of that race would receive unjustifiably high risk ratings. A biased COMPAS, for instance, might rate a black male defendant as a 4 when he should be a 2, as a 7 when he should be a 5, or as a 10 when he should be an 8.
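Those two conditions can be checked directly from outcome data. Below is a hedged sketch of how such a check might look; the column names (score, reoffended, race) and the use of AUC as the accuracy measure are illustrative assumptions, not a description of ProPublica’s or Northpointe’s actual analysis.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

def validity_checks(df: pd.DataFrame) -> None:
    """Assumes columns: 'score' (1-10), 'reoffended' (0/1), 'race'."""
    # Check 1: accuracy -- the score should actually predict recidivism.
    auc = roc_auc_score(df["reoffended"], df["score"])
    print(f"AUC: {auc:.2f} (0.5 = chance, 1.0 = perfect ranking)")

    # Check 2: fairness -- the same score should correspond to the same
    # observed recidivism rate regardless of race (equal calibration).
    calibration = (df.groupby(["score", "race"])["reoffended"]
                     .mean()
                     .unstack("race"))
    print(calibration)  # rows: scores 1-10; columns: observed rate by race
```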

If COMPAS were showing this kind of bias, it would show up in the actual recidivism rates of defendants at each score level. For instance, if blacks rated an 8 reoffended 60 percent of the time, while whites rated an 8 reoffended 75 percent of the time, that would indicate blacks are receiving substantially higher scores than their behavior warrants, and are therefore wrongly being treated as more of a crime risk than they really are.
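As a worked version of that comparison, here is a short sketch using the hypothetical 60 percent and 75 percent figures above; the numbers are invented solely to illustrate the test.

```python
# Hypothetical reoffense rates for defendants rated 8, by race
# (invented figures from the example above, not real data).
observed_rate_at_8 = {"black": 0.60, "white": 0.75}

# Under equal calibration, the two rates should be roughly equal.
gap = observed_rate_at_8["white"] - observed_rate_at_8["black"]
print(f"Reoffense-rate gap at score 8: {gap:.0%}")
if gap > 0:
    print("Black defendants rated 8 reoffend less often than whites rated 8,")
    print("so a score of 8 overstates black defendants' actual risk.")
```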
