Pay No Attention To The Man Behind The Curtain

It’s an open secret that our court system isn’t impartial. It should be: Lady Justice wears a blindfold to show that true justice does not take race, religion, gender, or income into account when determining guilt or innocence, and “equal justice under law” is still our ideal. But too often poor people and minorities get a raw deal when they come into contact with the justice system. Their bail is set higher, they are more likely to be wrongfully convicted, and they receive longer sentences.

So what’s to be done? Enter the all-powerful algorithm. In some jurisdictions, courts are using computer algorithms to determine whether someone should get bail and, if so, how much. Other jurisdictions are using computer programs that predict the risk of reoffending to guide sentencing decisions. Computers don’t make judgments based on skin color or income levels, right?

Well, not exactly. A ProPublica investigation of Broward County’s bail program revealed that the computer program used to set bail conditions was rating black defendants as “high risk” and white defendants as “low risk” in ways that appeared arbitrary, even when the white defendants had longer rap sheets and the black defendants were first-time offenders. This resulted in higher bail amounts for black people, and more black people sitting in jail awaiting trial or being coerced into accepting bad plea deals on the promise of getting out of jail sooner. From the report:

In forecasting who would re-offend, the algorithm made mistakes with black and white defendants at roughly the same rate but in very different ways. The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants. White defendants were mislabeled as low risk more often than black defendants.

This is the pattern some pundits have called “algorithmic guilt.” We make an inherent assumption that computers and algorithms are impartial and don’t make mistakes, but without access to the underlying data and the actual computer program itself, how do we know they’re impartial? After all, a computer program is only as good as the data and the code it uses.
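How can a model be wrong about both groups “at roughly the same rate” and still treat them very differently? A minimal sketch, using made-up counts rather than ProPublica’s actual figures, shows how the arithmetic works: two hypothetical groups can share the same overall error rate while one is falsely flagged as high risk about twice as often as the other.

# Toy illustration (hypothetical counts, not ProPublica's data): the same
# overall error rate can hide very different kinds of mistakes per group.

def error_rates(false_pos, false_neg, true_pos, true_neg):
    """Return (overall error rate, false positive rate, false negative rate)."""
    total = false_pos + false_neg + true_pos + true_neg
    overall = (false_pos + false_neg) / total
    # False positive rate: labeled "high risk" among those who did NOT re-offend.
    fpr = false_pos / (false_pos + true_neg)
    # False negative rate: labeled "low risk" among those who DID re-offend.
    fnr = false_neg / (false_neg + true_pos)
    return overall, fpr, fnr

# Hypothetical counts per 1,000 defendants in each group.
group_a = error_rates(false_pos=300, false_neg=100, true_pos=350, true_neg=250)
group_b = error_rates(false_pos=150, false_neg=250, true_pos=200, true_neg=400)

for name, (overall, fpr, fnr) in [("Group A", group_a), ("Group B", group_b)]:
    print(f"{name}: overall error {overall:.0%}, "
          f"false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")

With these made-up numbers, both groups come out with a 40% overall error rate, yet Group A is wrongly flagged as high risk about 55% of the time while Group B is wrongly flagged only about 27% of the time, which is the kind of gap ProPublica described.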

We’re fast approaching a time when computer algorithms make more and more decisions about everything, not just what Google displays when you search for restaurants or what you might like to buy on Amazon. It’s conceivable that criminal justice systems could soon be using technology that isn’t all that far removed from phrenology.

Imagine a computer program “impartially” declaring that black faces display more “guilty characteristics” than white faces, and therefore black people’s bail should be set higher because they are at greater risk of committing a new crime while out on bail. Imagine a prosecutor or a judge throwing up their hands and saying “it wasn’t me – I’m not racist – it’s the computer program.”

When we give decision-making authority over to computer algorithms in our criminal justice system without truly paying attention to the man behind the curtain, we run the risk of putting people at the mercy of magic guilt-o-meters.