Idaho is one step closer to becoming the first state to make pretrial risk assessment algorithms more transparent and open to public scrutiny, addressing long-festering concerns about the civil rights risks posed by automated pretrial tools in criminal justice.
House Bill No. 118, approved by the Idaho House on March 4th, requires that “all pretrial risk assessment tools shall be transparent” and all records of their use be “open to public inspection, auditing, and testing.”
It also invalidates the frequently used “trade secret” exemption that has made it more difficult for anyone challenging the algorithms to get details of how they work. The bill now goes to the state Senate.
“[T]he bill … really focuses on the transparency piece,” Representative Greg Chaney (R-10), the bill’s chief sponsor, told MuckRock. Chaney said he was “shocked and appalled” by risk assessments, which he compared to the 2002 film “Minority Report,” based on a Philip K. Dick story, in which police use technology to predict future crimes. “If there is something in there that either explicitly or implicitly brings in a bias, at the very minimum, we need to make sure that that’s always open to scrutiny, to litigation, to let the statisticians look and identify problems and let the public know what’s going on and to allow the process to work that way.”
Pretrial risk assessments estimate the likelihood that an arrested individual, if released, will fail to appear for a court date or will commit another crime. The assessments generate scores using a variety of metrics, such as criminal history. Judges are then able, but not obligated, to use these ratings when setting bail.
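A point-based assessment of this kind can be sketched in a few lines of code. The factors, weights, and cutoffs below are entirely hypothetical, chosen only to show the shape of the computation, not taken from any actual pretrial tool:

```python
# Illustrative sketch of a point-based pretrial risk score.
# All factor names, weights, and thresholds here are hypothetical.

def risk_score(prior_convictions: int,
               prior_failures_to_appear: int,
               age_at_arrest: int) -> int:
    """Sum weighted points per factor, as point-based tools typically do."""
    score = 0
    score += min(prior_convictions, 3)             # cap criminal-history points
    score += 2 * min(prior_failures_to_appear, 2)  # missed court dates weigh more
    if age_at_arrest < 23:                         # youth often adds points
        score += 1
    return score

def risk_category(score: int) -> str:
    """Map the raw score onto the coarse rating presented to a judge."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "moderate"
    return "high"

print(risk_category(risk_score(prior_convictions=0,
                               prior_failures_to_appear=0,
                               age_at_arrest=35)))  # low
```

Transparency laws like HB 118 aim to expose exactly these kinds of internals: which factors are counted, how heavily each is weighted, and where the category cutoffs fall.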
Some advocates for bail reform say the assessments can help reduce the prison population. Nationally, a quarter of people imprisoned, or about 615,000, are pretrial detainees in local jails. Of those, approximately 415,000 have not previously been convicted of a crime, according to the research group Prison Policy Initiative. According to advocates such as the Pretrial Justice Institute, assessment tools can be used to find alternative support services besides jailing someone, including aid for people suffering from mental health issues or substance abuse. They also can be used to encourage judges to release someone on personal recognizance without bail, if the assessment tool suggests that person will show up for the next court date.
The experience of New Jersey highlights the potential benefits of the tools. In 2017, the first year of its reduced reliance on cash bail, the state’s pretrial jail population fell 20 percent, from 7,173 to 5,743 individuals. The state relied on the Arnold Foundation’s Public Safety Assessment tool, which was developed and has been actively promoted by the privately funded organization. The tool, which considers nine data points in its scoring and has been validated against local data where it’s used, has been adopted by dozens of jurisdictions, making it one of the most common pretrial evaluation tools in use.
Amber Widgery, Senior Policy Specialist at the National Conference of State Legislatures, told MuckRock that so far states have not discouraged the use of risk assessment tools but have tried to encourage best practices.
“I’ve never seen state legislation moving against risk assessment. I think it’s generally the broad picture across the states that risk assessment is a good thing as long as we’re adopting current best practices and the most current research based on this issue.” The state legislators organization has partnered with the Arnold Foundation to track pretrial policy at the state level.
However, civil rights groups remain concerned that computational models, like those used in the risk assessments, have a substantial downside and could automate and reinforce prejudices against groups that have long been subject to discrimination. Last year, more than 100 non-profit organizations, including the American Civil Liberties Union and the National Association for the Advancement of Colored People, signed a letter opposing any jurisdiction’s use of pretrial risk assessments. As part of that statement, the groups recommended that “[A] pretrial risk assessment instrument must be developed with community input, revalidated regularly by independent data scientists with that input in mind, and subjected to regular, meaningful oversight by the community.”
An earlier version of the Idaho House bill prohibited the use of any risk assessment tool not found in validation studies to be “free of bias.”
Chaney said the House Judiciary Committee struck that language after becoming “bogged down” in discussion of the technical requirements for validating bias or the lack of it. Though validation is no longer explicitly required, the measure mandates public access to any validation studies that are used.
Chaney estimated that 33 of Idaho’s 44 counties currently use a form of pretrial risk assessment, but with little transparency. None, he said, use Northpointe’s COMPAS assessment, a proprietary tool that a 2016 ProPublica investigation said inflated risk scores for African-Americans and underestimated them for Caucasians.
Northpointe was subsequently allowed to keep its algorithm private under trade secret rules during the Wisconsin case State v. Loomis, in which the defendant argued that an algorithm that could not be viewed publicly violated the right to due process. The Wisconsin Supreme Court ruled, however, that cases containing a COMPAS report should include a “written advisement listing the limitations,” among them the need for the software to be tested for accuracy using data from local populations. The U.S. Supreme Court declined to hear an appeal, allowing the Wisconsin high court ruling to stand.
Chaney is adamant that the public needs substantial, detailed information about pretrial release algorithms and the tools that use them.
“Any pretrial release tool used in assessing risk must be absolutely open to public records request to understand the methodology, the criteria, the weight, the scoring,” Chaney said. “That has to be absolutely open to the public.”
The current version of HB 118 is embedded below.
Algorithmic Control by MuckRock Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Based on a work at https://www.muckrock.com/project/algorithmic-control-automated-decisionmaking-in-americas-cities-84/.
Image via Idaho Legislature