Can technology help predict crime and flight risk?

Kahryn Riley
Mackinac Center for Public Policy

Law school professors teach that the legal system relies on judicial neutrality and binding precedent to ensure that cases get resolved objectively and consistently. But, they regularly caution, the reality is that sometimes an outcome can be influenced by what the judge ate for breakfast that morning.

In today’s increasingly data-driven world, that possibility is unacceptable, given that data can provide clarity and accountability. Many criminal courts, thankfully, are innovating in an effort to remove the last vestiges of partiality from the courtroom.

Although Michigan has adopted a set of sentencing guidelines that account for an offender’s criminal history and the severity of the crime, these two factors are not enough to guide decisions about whether to release an accused criminal who is awaiting trial.

This important decision has serious implications for the defendant’s ability to retain employment, housing and child custody, and at least five studies show that pretrial detention has an adverse impact on the outcome of the defendant’s case.

The Michigan Court Rules name several factors judges should consider when deciding whether to release a defendant before trial. These include the person’s mental condition, financial status, employment status, character and reputation, and ties to the community. But the rules do not indicate how much weight a judge should give to each or all of these factors, and some research suggests that they may not include all the factors that most reliably predict an individual’s risk of flight or new criminal behavior.

Thus, some criminal courts have begun to use risk assessment tools borrowed from actuarial science. These tools collect the information thought most likely to predict an individual’s risk to society and use it to generate a score for each defendant, indicating whether it is safe to release him or her.
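To make the mechanics concrete, here is a minimal sketch of how such a tool might work, assuming a simple weighted-factor (logistic) model. Every factor name, weight and threshold below is invented for illustration; real instruments such as COMPAS use proprietary models whose inputs and weights are not public.

```python
# A deliberately simplified sketch of how an actuarial risk tool turns
# questionnaire answers into a release recommendation. All factors and
# weights here are hypothetical, not drawn from any real instrument.
import math

# Hypothetical factors and weights, in the spirit of a logistic model.
WEIGHTS = {
    "prior_arrests": 0.35,           # each prior arrest raises the score
    "age_under_25": 0.80,            # youth is a common actuarial factor
    "unemployed": 0.45,
    "prior_failure_to_appear": 0.90,
}
INTERCEPT = -2.0

def risk_score(defendant: dict) -> float:
    """Return a 0-1 risk estimate via a logistic transform."""
    z = INTERCEPT + sum(w * defendant.get(k, 0) for k, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

def recommendation(defendant: dict, threshold: float = 0.5) -> str:
    """Collapse the continuous score into a binary recommendation."""
    return "detain" if risk_score(defendant) >= threshold else "release"

# Example: one prior arrest, under 25, employed, no prior failure to appear.
defendant = {"prior_arrests": 1, "age_under_25": 1}
print(f"score {risk_score(defendant):.2f} -> {recommendation(defendant)}")
```

Note that even in this toy version, the choice of factors, weights and cutoff embeds policy judgments that the final score conceals.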

These tools may sound like a magic formula, but media outlets, academics and others have criticized them sharply. One prominent risk assessment tool, COMPAS, developed by Northpointe, Inc. and used by the Michigan Department of Corrections, was analyzed by ProPublica in a 2016 article that quickly received much attention. The news outlet claimed that the tool contained racial biases and produced inaccurate predictions. Follow-up research suggests the problem may be even worse than ProPublica alleged: a study published in the journal Science Advances found that COMPAS was no more accurate at predicting recidivism than untrained humans.
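For a sense of what the bias claim means in practice, the sketch below uses invented numbers, not ProPublica’s data, to compute two measures for two synthetic groups: overall accuracy and the false positive rate, that is, the share of people who never reoffended but were labeled high risk. ProPublica’s central finding had roughly this shape: similar accuracy across racial groups, but a much higher false positive rate for black defendants.

```python
# Illustrative only: synthetic numbers, not ProPublica's COMPAS data,
# showing the kind of audit behind the bias claim.

def false_positive_rate(records):
    """Share of people who did NOT reoffend but were flagged high risk."""
    did_not_reoffend = [r for r in records if not r["reoffended"]]
    flagged = [r for r in did_not_reoffend if r["high_risk"]]
    return len(flagged) / len(did_not_reoffend)

def accuracy(records):
    """Share of records where the high-risk flag matched the outcome."""
    return sum(r["high_risk"] == r["reoffended"] for r in records) / len(records)

def make_group(tp, fp, tn, fn):
    """Build a synthetic group from counts of the four outcomes."""
    return ([{"high_risk": True,  "reoffended": True}]  * tp +
            [{"high_risk": True,  "reoffended": False}] * fp +
            [{"high_risk": False, "reoffended": False}] * tn +
            [{"high_risk": False, "reoffended": True}]  * fn)

# Both groups score 70% accuracy, yet Group A's false positive rate
# (20 of 60 non-reoffenders) is triple Group B's (5 of 50).
group_a = make_group(tp=30, fp=20, tn=40, fn=10)
group_b = make_group(tp=25, fp=5,  tn=45, fn=25)

for name, group in [("Group A", group_a), ("Group B", group_b)]:
    print(f"{name}: accuracy {accuracy(group):.0%}, "
          f"false positive rate {false_positive_rate(group):.0%}")
```

As the output shows, a tool can look evenhanded on one measure while failing another, which is part of why these audits sparked such intense debate.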

The goal of risk assessments — to accurately predict a given individual’s risk to society — is a laudable one. But the concerns about their accuracy and bias should temper courts’ reliance on their results. The irony is that these tools attempt to generate an individualized result by relying on statistical generalizations, and they attempt to provide predictability and accountability by relying on proprietary algorithms that are far from transparent.

The middle road in this case is the best path forward. Judges should consider new data and research, but some measure of human judgment is needed as well. These high-stakes decisions should never be left entirely to a statistical algorithm.

——————

Kahryn A. Riley manages the Mackinac Center for Public Policy's criminal justice reform initiative. A Michigan native, she studied in the honors programs at Hillsdale College and Regent University School of Law. She holds a BA in politics and a law degree, and is admitted to the State Bar of Michigan. The Mackinac Center for Public Policy is a research and educational institute headquartered in Midland.