Greg Chaney
Unfortunately, the more one looks at the use of algorithms, the clearer it becomes that such systems are utterly lacking in transparency. Examining the landscape surrounding risk assessment reveals an Orwellian nightmare: We are told to trust the forces behind the movement and their claims that the algorithms were built properly and work as promised.
An article titled "Life, Liberty, and Trade Secrets"[1] recently appeared in the Stanford Law Review, revealing that private providers of algorithms have asserted trade secret protections in criminal cases to shield their intellectual property. Such protections have been claimed over algorithms built by nonprofit, for-profit and even government entities, in an effort to prevent criminal defendants from challenging the integrity of the models.
The author, Rebecca Wexler, a visiting fellow at Yale Law School, provided several relevant case studies, including one concerning a defendant in a Pennsylvania death penalty case who was assessed by an algorithm at sentencing and was not allowed to scrutinize its source code.
She noted that the examples cited are not isolated, and that assertion of trade secret protection is a growing trend at every stage of criminal cases, from policing to parole.
Wexler also cited the case of Martell Chubbs, who was denied access to the source code of a forensic program that was used to convict him.[2] The California Court of Appeal ruled for the program's developer, in the process likely becoming the first appellate court in U.S. history to extend a trade secret evidentiary privilege in a criminal case.
Shockingly, the privilege was not merely one entitling the developer to a protective or sealing order, which would have made sense. Instead, it allowed the developer to withhold the source code "entirely." As Wexler points out, the Chubbs case has thus formed the basis for a new body of case law nationwide that denies access to the underlying source code of algorithms used throughout the criminal justice system.
Wexler also noted that the issue is not receiving the attention it warrants. She concluded by suggesting that builders and users of such algorithms should no longer be permitted to hide behind the smokescreen of trade secrets, and called for state governments to take immediate action to prevent this from occurring in the future.
Just as disturbing as the notion that defendants are denied the right to examine the very data and source code being used to keep them incarcerated is the fact that the public is locked out as well. That secrecy bars not only defendants but everyone else from analyzing these systems, including countless university researchers eager to determine whether the tools actually work and what their inherent flaws might be. There is no accountability whatsoever, which is a huge problem. At even the most basic level, it is impossible to check the math of these systems and verify that they were properly built.
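To illustrate what even that basic check would involve, consider the minimal sketch below. It assumes a hypothetical points-based instrument whose factors, weights and score cutoffs have been disclosed; every name and number in it is invented for illustration and describes no actual tool.

```python
# Minimal sketch of "checking the math" for a hypothetical points-based
# pretrial instrument whose design has been disclosed. All factor names,
# weights and cutoffs are invented for illustration.

WEIGHTS = {
    "pending_charge": 1,
    "prior_failure_to_appear": 2,
    "prior_violent_conviction": 2,
}

# Hypothetical cutoffs mapping a raw point total to a risk level.
CUTOFFS = [(0, "low"), (2, "moderate"), (4, "high")]

def recompute(record):
    """Recompute the raw score and risk level from the disclosed design."""
    raw = sum(points for factor, points in WEIGHTS.items() if record.get(factor))
    level = "low"
    for threshold, label in CUTOFFS:
        if raw >= threshold:
            level = label
    return raw, level

# Audit one hypothetical case: compare the tool's reported level with ours.
defendant = {"pending_charge": True, "prior_failure_to_appear": True}
reported = "high"  # invented output from the black-box tool
raw, level = recompute(defendant)
if level != reported:
    print(f"Discrepancy: recomputed '{level}' ({raw} points); tool reported '{reported}'")
```

Trivial as this check is, it requires exactly the information that trade secret assertions withhold.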
As a legislator in the Idaho House of Representatives, I knew the time had come to do something about the situation. Last year, there was a movement in my state, one that included the Idaho Supreme Court, to eliminate the right to bail by amending the state constitution and creating a system of risk-based preventive detention. Key to the new scheme was the use of the aforementioned pretrial risk assessment algorithms. Because this was clearly going to remain a hot-button issue, there was an urgent need to get things right before we took such a drastic step.
Part of the process involved getting additional background on pretrial risk assessments. One article, "Assessing Risk Assessment in Action,"[3] authored by Megan T. Stevenson, an assistant professor at the Antonin Scalia Law School at George Mason University, revealed the truth behind some commonly held beliefs about the algorithms.
Stevenson analyzed data from Kentucky, which was using the Public Safety Assessment, a proprietary algorithm developed by the Laura and John Arnold Foundation (now Arnold Ventures). She found that rather than causing mass decarceration, the tool produced only a trivial decrease in the jail population in practice, and she noted an uptick in failure-to-appear rates and pretrial crime. While risk assessments may have worked in some jurisdictions, the evidence would seem to indicate that the upside of their use may be outweighed by their shortcomings.
Another significant document on the subject is a 2018 statement from The Leadership Conference on Civil and Human Rights. Representing 100 national civil rights groups, the organization stated, "we believe that jurisdictions should not use risk assessment instruments in pretrial decision making."[4]
While banning assessments altogether is worth considering, such a move would probably go too far. Rather, it is in everyone's best interests to heed the groups' advice on the issue: "If in use, a pretrial risk assessment instrument must be designed and implemented in ways that reduce and ultimately eliminate unwarranted racial disparities across the criminal justice system."
Anyone examining the assessments in use across the nation, including in my state of Idaho, would see that there is scant evidence they have been tested for racial or other bias. It would also be clear that they were not designed or implemented in a manner that would reduce racial disparities.
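To make concrete what such testing could look like, the sketch below compares false positive rates across demographic groups, one common form of disparity analysis. The records are invented for illustration; a real audit would require the very outcome data and risk scores that secrecy currently withholds.

```python
# Minimal sketch of one common bias test: comparing false positive rates
# across groups, i.e., how often people who were NOT rearrested pretrial
# were nonetheless flagged as high risk. All records are invented.
from collections import defaultdict

# Each record: (group, flagged_high_risk, rearrested_pretrial)
records = [
    ("A", True, False), ("A", False, False), ("A", True, True),
    ("B", True, False), ("B", True, False), ("B", False, True),
]

flagged = defaultdict(int)    # non-rearrested people flagged high risk
negatives = defaultdict(int)  # all non-rearrested people, per group

for group, high_risk, rearrested in records:
    if not rearrested:
        negatives[group] += 1
        if high_risk:
            flagged[group] += 1

for group in sorted(negatives):
    fpr = flagged[group] / negatives[group]
    print(f"Group {group}: false positive rate = {fpr:.0%}")

# A large gap between groups is evidence of exactly the kind of unwarranted
# disparity the Leadership Conference statement warns against.
```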
The Leadership Conference on Civil and Human Rights also stated that, in order to be implemented, pretrial risk assessment instruments must be transparent, validated by independent means and open to challenge by a defendant's legal counsel. In addition, the groups considered it critical that the public be privy to the design and structure of the tools, reiterating that the instruments must be transparent to everyone.
Based on these rock-solid principles, along with the suggestions offered by Professor Wexler, I drafted legislation to address and fix these problems in my state of Idaho. It was also my great hope that a national conversation on the subject would emerge in the process. To be sure, these issues are not going away by themselves.
The bill I introduced, Idaho House Bill 118, would have required that pretrial risk assessment algorithms be proven free of bias against protected classes before being implemented. It would also have eliminated trade secret protections for the algorithms' proprietors and required that all underlying data and source code be open to academic researchers as well as the public.
Surprisingly, the biggest pushback against the legislation came when it was revealed that the tools could not be shown to be bias-free. Incredibly, the users of the algorithms essentially admitted that the tools were likely biased and would therefore be eliminated if the legislation passed. I did not consider this a positive outcome. I have held out a degree of hope that, with more research and scholarship, these algorithms could be built in ways that do not increase or magnify the existing bias in the criminal justice system.
Accordingly, my colleagues in the Idaho House and I settled on eliminating trade secret protection while requiring transparency about how the algorithms are constructed. Collectively, we considered this a critical step that would help us answer the questions we harbored. Subsequently, during hearings on the legislation in the House Judiciary Committee, Michael Ekstrand, a professor at Boise State University, expressed keen interest in performing scholarly research on the subject, provided he could obtain the heretofore secret data. We were intrigued by the notion that we might at last have a vehicle by which to learn the truth about the degree of systemic bias.
The only opposition to the legislation came from the Idaho Association of Criminal Defense Lawyers. They noted that a deal had already been made with the state Supreme Court and the Idaho Criminal Justice Commission to run a package of legislation in 2020 that would expand the use of pretrial risk assessments statewide and, in the process, eliminate the constitutional right to bail in Idaho in favor of exclusive use of algorithms. While seemingly disheartening, this news only added fuel to the fire behind our bill among the majority of my fellow legislators. It was apparent that we would be asked next year to make a leap the evidence could not support, one that would only make the problem worse and create widespread electronic discrimination.
Then something unexpected happened. I had believed that the proprietors of the algorithms, desiring to protect their trade secrets, would vigorously oppose House Bill 118. Not only did that not happen; in fact, the opposite occurred.
John Arnold, speaking on behalf of Arnold Ventures, agreed with me that it was time to bring transparency and to end protections for intellectual property used in the criminal justice system. Arnold said, "We agree with this proposal from Idaho that risk assessment tools are too important to be black boxes. The methodology must be transparent and open to public inspection, auditing and testing."
This was an astounding turn of events. What he said next was even more surprising: "One effect of this proposal, for better or for worse, is the private sector will not be able to protect its IP, and thus may not be active in risk assessments. The answer is for philanthropy to create these tools that jurisdictions are asking for while making them fully transparent." Arnold's position was now in lockstep with that of the vast majority of legislators in my state. Despite my preconceptions, I commend Arnold for having the courage to stand with us on this issue, perhaps to the detriment of his own business.
On March 4 of this year, House Bill 118 passed the Idaho House of Representatives by a vote of 66-2-2. On March 18, the Senate also passed the bill, with a tally of 33-0-2. Out of 105 legislators in my state, 99 voted in favor of this legislation. Although nothing is ever guaranteed in politics, there is a prevailing belief that Gov. Brad Little will sign the bill into law, making Idaho the first state to eliminate trade secret protections in criminal justice and forcing algorithmic transparency.
House Bill 118 was a great start toward addressing an important societal issue. While it has personally been a fascinating subject to tackle, our work has only just begun. As a society, we cannot allow black box technologies to trample on liberties and impermissibly discriminate against people, certainly not in secret.
I have made it my personal mission to force fundamental change to our current system. In the spirit of true justice, I call on all judges, lawyers and other professionals in the criminal justice system to join with me. It is too important an issue for us to ignore any longer.
Greg Chaney is a member of the Idaho House of Representatives.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the firm, its clients, or Portfolio Media Inc., or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.
[1] Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343 (2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2920883.

[2] See People v. Superior Court (Chubbs), No. B258569, 2015 WL 139069, at *3, *7, *9-10 (Cal. Ct. App. Jan. 9, 2015).

[3] Megan T. Stevenson, Assessing Risk Assessment in Action, 103 Minn. L. Rev. 303 (2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3016088.

[4] The Leadership Conference on Civil and Human Rights et al., The Use of Pretrial "Risk Assessment" Instruments: A Shared Statement of Civil Rights Concerns (2018), http://civilrightsdocs.info/pdf/criminal-justice/Pretrial-Risk-Assessment-Full.pdf.