Idaho legislators approve law requiring transparency for risk assessment tools

Governor Brad Little is expected to decide this week whether he will approve the measure

Edited by Miranda Spivack

Idaho legislators have approved the first bill in the nation to require expansive transparency about the use of risk assessment tools in the criminal justice system. The measure passed unanimously in the Idaho House on March 25th after winning approval in the state Senate. It now goes to the desk of Idaho Governor Brad Little (R), who has not said if he will sign it.

The measure is the first in the country to specifically address transparency in the use of computerized risk assessment tools to predict the likelihood of a repeat offense before a judge sets bail or a prisoner is released on parole. It is also the first to set requirements for public access to the design and data behind the tools’ predictions, requiring that records created during a risk assessment tool’s development, validation, and use “be open to public inspection, auditing, and testing.” The measure also gives defendants the right to “review all calculations and data used to calculate the defendant’s own risk score.”

A spokeswoman for Little declined to comment on the governor’s plans for the bill.

“Governor Little examines each piece of legislation thoughtfully and individually,” Marissa Morrison, a spokesperson for the governor, told MuckRock via email. “He will not issue statements on pending legislation during the deliberative process.”

However, the bill’s sponsor, State Representative Greg Chaney, a fellow Republican, told MuckRock that the governor’s office had expressed no reservations about the measure. Chaney anticipates a decision from the governor soon. The governor has seven days to sign or veto the measure once it is transmitted to his office.

Chaney became interested in the issue after hearing news reports describing the potential for bias in the risk assessment system.

“[T]he companies who make [the assessment tools] claim trade secrets, and they end up not being compelled to share their methodology or the contents of the algorithm. Essentially it is quite frightening that there’s this predictive model out there saying that you need a particular bail or sentence because of something you haven’t done yet and, by the way, you don’t get to know how they’re doing it. It’s almost too far out there to believe, but it’s happening,” Chaney said.


Risk assessment tools in the criminal justice system have come under increasing scrutiny after academic and other studies found that the tools, which are based on algorithms, may rely on biased datasets when assigning the risk scores that affect an individual’s release and rehabilitation options.

Some advocates for policies to address mass incarceration, however, believe that risk assessment tools are among the few options available to reduce prison populations. In many jurisdictions, jails are crowded with pre-trial detainees who cannot afford bond but pose no flight risk. According to Chaney, in early discussions of the Idaho House Committee on Judiciary, Rules and Administration, opponents of the bill, which initially required that any risk tool be “free of bias,” questioned whether stricter anti-bias rules would stymie tool-based reform efforts.

“If eliminating bias [is] too much of a restriction [on the tools’ use], then maybe that’s what needs to happen,” Chaney told MuckRock, saying he’d rather eliminate the tools than use a biased system.

“The transparency bill that it turned into became more of a way of making sure that there wasn’t anything that could be said specifically to unfairly keep them [the tools] from use,” Chaney said. “But if they are going to be used … widespread and with that kind of finality, they need to be where they can be challenged and [watchdogs] hopefully expose them. If those biases present themselves or when those biases present themselves,” he said, he hopes that there are people who will be able to “push the issues.”

The full bill is embedded below.

Creative Commons License
Algorithmic Control by MuckRock Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Based on a work at https://www.muckrock.com/project/algorithmic-control-automated-decisionmaking-in-americas-cities-84/.

Image by Frank Schulenburg via Wikimedia Commons, licensed under CC BY-SA 4.0.