Using Algorithms to Sentence People?

When it comes to solving difficult problems, technology can be of great use. Everybody knows how useful calculators and computers are when you are trying to find a solution in math or science. For the most part, these calculations merely provide us with some basic information, deduced from a set of parameters, that we can then use to continue an experiment or confirm a result. While I am not an expert in computer science, I am well aware of how useful algorithms are in everyday life and in complex scientific and digital problems. But how far does their utility extend?

In Algorithms of Oppression, I was struck by the mention of a sentencing software called COMPAS, developed by a company once known as Northpointe (incorrectly given as the name of the software in the reading on page 27) and now known as Equivant. The software takes the answers to a 137-item questionnaire covering most aspects of a defendant’s life and spits out a number that corresponds to the defendant’s risk of re-offending. Judges can then choose to take this information into account during sentencing.
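
To make the mechanics concrete, here is a minimal sketch in Python of how a questionnaire-based tool like this might work. Everything in it, the five stand-in questions, the item weights, and the cut points, is invented purely for illustration; the real COMPAS model is a trade secret, though it does report a 1-10 score that gets bucketed into low, medium, and high risk.

```python
# Hypothetical sketch of a questionnaire-based risk score.
# The actual COMPAS model is proprietary; these weights, items,
# and cutoffs are made up for illustration only.

def risk_score(answers, weights):
    """Combine questionnaire answers into a raw score, then map it
    to a 1-10 rating like the one COMPAS reports to judges."""
    if len(answers) != len(weights):
        raise ValueError("one weight per questionnaire item")
    raw = sum(a * w for a, w in zip(answers, weights))
    # Map the raw score onto a 1-10 scale (cut points are invented).
    return min(10, max(1, int(raw // 10) + 1))

# Toy example: 5 items instead of 137, answers coded as numbers.
answers = [3, 1, 0, 2, 4]            # responses on small scales
weights = [2.0, 5.0, 1.5, 3.0, 4.0]  # hypothetical item weights

score = risk_score(answers, weights)
label = "low" if score <= 4 else "medium" if score <= 7 else "high"
print(score, label)  # the kind of rating a judge would see
```

The point of the sketch is how little the defendant sees: the answers go in, a single number comes out, and the weights in between are hidden.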

Above is a screenshot taken from Equivant’s website. After selecting “Judicial Officer” from the variety of roles listed on the site, I was directed to this page, which tries to convince me that the algorithm will make sentencing less of a burden.

One of the problems I have with this system is that defendants and their lawyers are not allowed to see the information that is fed into the algorithm or the result it produces. Judges themselves do not even have access to the algorithm’s inner workings (presumably for trade-secret reasons). Even more striking, the algorithm has only a 65% success rate at correctly identifying repeat offenders. How can such a process be legal? Because judges are supposed to treat it as only one of many factors in determining a sentence.

In Algorithms of Oppression, the author spoke a great deal about the racism and sexism unknowingly embedded in many search-engine algorithms. This extends to COMPAS as well: a study analyzing criminal records from Broward County, Florida showed that the algorithm is biased against African Americans. Such a revelation greatly complicates matters, as it is surely immoral to use such an algorithm to sentence anyone.

So how does the justice system go forward? I would suggest that the algorithm be improved until it can predict recidivism more accurately without racial input. Sentencing, since it has always involved a human making a decision, has always been tainted by prejudice. With technology, we now have an opportunity to eliminate or at least minimize it, creating a fairer process for all.
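
The Broward County finding is easier to grasp with numbers. Below is a hedged sketch using a tiny invented dataset (not the real Broward County data): both groups get the same overall accuracy, roughly the 65% figure above, yet one group absorbs all of the false positives, people wrongly flagged as likely re-offenders, while the other group’s errors are all false negatives. That asymmetry is essentially the pattern the study reported.

```python
# Synthetic illustration of how an algorithm can look "accurate"
# overall yet be biased in the kinds of errors it makes.
# The rows below are invented, not the Broward County dataset.

records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("A", True,  True),  ("A", True,  True),  ("A", True,  False),
    ("A", True,  False), ("A", False, False), ("A", True,  True),
    ("B", False, False), ("B", False, False), ("B", False, True),
    ("B", False, True),  ("B", True,  True),  ("B", True,  True),
]

def false_positive_rate(rows):
    """Share of non-reoffenders who were wrongly flagged high-risk."""
    negatives = [r for r in rows if not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

for group in ("A", "B"):
    rows = [r for r in records if r[0] == group]
    correct = sum(1 for r in rows if r[1] == r[2])
    print(group,
          f"accuracy={correct / len(rows):.0%}",
          f"false-positive rate={false_positive_rate(rows):.0%}")
```

Both groups come out at about 67% accuracy, but group A’s false-positive rate is 67% while group B’s is 0%. A single “success rate” number hides exactly the kind of disparity the study uncovered.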
