Authored By: Saachi Dhingra
Vivekananda Institute of Professional Studies
Overcrowding in prisons, high rates of recidivism, and a lack of resources have become increasingly pressing issues for criminal justice systems around the world in recent years. In response, decision-making procedures have been supported by risk assessment systems such as COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). COMPAS was first introduced in the United States and is intended to estimate offenders’ probability of recidivism.
Though these techniques have many advantages, they also raise important moral and legal questions, especially concerning fairness, openness, and the so-called “black box” phenomenon. Northpointe (now Equivant) developed COMPAS, which uses a proprietary algorithm to predict the chance of recidivism for offenders.
COMPAS creates risk ratings by examining several variables, such as a person’s criminal history, social networks, and behavioral patterns. Judges, parole boards, and correctional staff use these scores to inform decisions on bail, sentencing, and parole.
The instrument has some essential parts:
- Evaluation of Risk: According to COMPAS, there are three main risks: violent recidivism, pretrial misconduct, and general recidivism.
- Needs assessment: To direct rehabilitative therapies, it identifies criminogenic needs, such as substance abuse or mental health difficulties.
- Criminal History and Social Factors: To create a complete picture, the algorithm also takes into account an individual’s age, work status, previous offences, and relationships with family and peers.
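Because the actual COMPAS model is proprietary, its internal computation cannot be reproduced here. The following sketch is purely illustrative: the feature names, weights, and thresholds are invented, and only the output format (decile scores from 1 to 10, banded into low, medium, and high risk, which COMPAS publicly reports) reflects the real tool.

```python
# Illustrative only: COMPAS's real model is proprietary, so the
# features, weights, and thresholds below are invented assumptions.

def risk_score(features, weights):
    """Combine normalized risk factors (each in [0, 1]) into a raw score."""
    total = sum(weights[k] * features[k] for k in weights)
    return total / sum(weights.values())

def to_decile(raw):
    """Map a raw score in [0, 1] to a 1-10 decile, the format COMPAS reports."""
    return min(10, int(raw * 10) + 1)

def risk_level(decile):
    """COMPAS-style banding: 1-4 low, 5-7 medium, 8-10 high."""
    if decile <= 4:
        return "low"
    if decile <= 7:
        return "medium"
    return "high"

# Hypothetical offender profile (all factors scaled to [0, 1]).
profile = {"prior_offenses": 0.8, "age_factor": 0.6,
           "employment_instability": 0.4, "peer_associations": 0.7}
weights = {"prior_offenses": 3.0, "age_factor": 2.0,
           "employment_instability": 1.0, "peer_associations": 1.5}

print(risk_level(to_decile(risk_score(profile, weights))))  # prints "medium"
```

The black box problem discussed below arises precisely because, in the real system, the analogue of the `weights` table is a trade secret that defendants cannot inspect.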
The capacity of COMPAS to provide data-driven insights regarding offender risk has been praised. This capability may result in more knowledgeable, impartial, and consistent court rulings. However, its opacity and the ethical issues surrounding its use raise serious concerns.
The Mystery of the Black Box
A “black box” is a system or process that has visible inputs and outputs but obscure or unintelligible internal workings that determine how the system gets a particular result. The black box dilemma occurs when algorithmic risk assessments, such as COMPAS, use proprietary underlying decision-making algorithms that are not open to outside inspection.
A. lack of transparency
COMPAS’s lack of openness is one of the main complaints leveled against it. Defendants, their attorneys, and even judges are unable to completely comprehend the process by which risk scores are calculated since COMPAS is dependent on a proprietary algorithm. Due process is seriously compromised by this lack of openness since people are being assessed using standards that are not made known to them.
In State v. Loomis, the defendant raised this issue, claiming that the use of COMPAS in sentencing infringed his right to due process because he was unable to contest the algorithm’s methodology. While the Wisconsin Supreme Court upheld the use of COMPAS, it required that judges be given warnings about the instrument’s limits and stressed that the tool should not be the only factor considered when determining a sentence.[1]
The accountability issue is brought up by COMPAS’s “black box” design. If an algorithm determines a defendant’s high-risk score, but its inner workings are hidden, how can the system make sure the outcome is just and fair?
B. Fairness and Bias in Algorithms
Potentially ingrained biases in the algorithm are a major worry regarding the black box phenomenon. Several studies have demonstrated that minorities may be disproportionately classified as high-risk offenders by COMPAS. ProPublica’s analysis, for example, found that Black defendants who did not reoffend were nearly twice as likely as their white counterparts to be mislabeled as higher risk, while white defendants who did reoffend were more often mislabeled as low risk[2].
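The disparity ProPublica measured can be expressed as a gap in error rates between groups. The sketch below uses invented counts (they are not the Broward County data ProPublica analyzed, though they are chosen to loosely echo the pattern it reported) to show how false positive and false negative rates are computed and compared.

```python
# Toy illustration of the error-rate disparity ProPublica measured.
# The counts below are invented; they are not the actual study data.

def error_rates(records):
    """records: list of (predicted_high_risk, reoffended) booleans.
    Returns (false_positive_rate, false_negative_rate)."""
    fp = sum(1 for pred, actual in records if pred and not actual)
    fn = sum(1 for pred, actual in records if not pred and actual)
    negatives = sum(1 for _, actual in records if not actual)
    positives = sum(1 for _, actual in records if actual)
    return fp / negatives, fn / positives

# Hypothetical groups of 200 defendants each: (labeled high risk?, reoffended?)
group_a = [(True, False)] * 45 + [(False, False)] * 55 + \
          [(True, True)] * 70 + [(False, True)] * 30
group_b = [(True, False)] * 23 + [(False, False)] * 77 + \
          [(True, True)] * 52 + [(False, True)] * 48

fpr_a, fnr_a = error_rates(group_a)  # 0.45, 0.30
fpr_b, fnr_b = error_rates(group_b)  # 0.23, 0.48
```

Even though both groups are scored by the same model, group A's non-reoffenders are wrongly labeled high risk far more often (0.45 vs. 0.23), while group B's reoffenders are more often wrongly labeled low risk: this asymmetry in error rates, rather than any explicit racial input, is what the bias critique targets.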
These prejudices call into doubt the fairness of employing private algorithms in criminal justice. Injustices may result, especially for underrepresented groups, if historical and societal prejudices distort COMPAS’s risk scores.
COMPAS Implementation in the United States
In the United States, states including Wisconsin, New York, and California make extensive use of COMPAS. It is used at several stages of the criminal justice process, from pretrial hearings to post-release decisions.
A. Decision-Making Before Trial
COMPAS is used to determine whether an accused person should be detained or released on bond. Its predictive capacity for pretrial misconduct allows courts to make more informed decisions about the risk an individual would pose to society if released.
B. Penalties and Release
COMPAS scores are frequently used by judges to determine punishment. The goal of the tool is to give decision-makers an objective way to gauge whether someone should go through probation or rehabilitation instead of a jail sentence. Similar to this, parole boards evaluate an offender’s suitability for early release using COMPAS.
C. Criticisms and Legal Disputes
Although COMPAS is praised for supporting evidence-based decision-making, it has also encountered several legal difficulties, most of which have to do with fairness and due process. One of the most well-known cases illustrating the conflict between constitutional rights and technical progress is the Loomis case.
International Use of Risk Assessment Instruments: Insights from Various Legal Frameworks
The United States is not the only country that uses risk assessment instruments like COMPAS. Lessons on how such systems might be incorporated into various legal frameworks have been learned from other nations that have implemented such instruments with differing degrees of success.
A. The United Kingdom
Similar functions are performed by risk assessment instruments in the UK, such as the Offender Assessment System (OASys). Nonetheless, the UK’s strategy places a strong emphasis on judicial monitoring and openness, which allays some of the complaints related to the black box issue.
B. Canada
The Level of Service/Case Management Inventory (LS/CMI) is used in Canada to evaluate the risk of recidivism. The Canadian system is unusual in its consideration of the special circumstances of Indigenous offenders, reflecting a culturally sensitive approach to criminal justice.[3]
C. Australia
Similar instruments are used in Australia, where the Risk-Needs-Responsivity (RNR) model guides the distribution of rehabilitative resources and evaluates the likelihood of reoffending. Rehabilitative tactics and risk assessment are integrated into Australia’s system, which makes it noteworthy and guarantees that offenders receive individualized interventions depending on their requirements.[4]
COMPAS Significance within the Criminal Justice System of India
With its overworked courts and overcrowded jails, India’s criminal justice system is ready for innovation. Using a risk assessment tool such as COMPAS could offer much-needed relief by decreasing recidivism and enhancing court efficiency. But there are a few issues that need to be resolved.
A. Possible Advantages
COMPAS has the potential to greatly improve India’s parole and pretrial decision-making processes. The existing system frequently makes subjective decisions, which results in uneven outcomes for parole and bail. More uniformity and fairness in these choices could be achieved with the use of an objective risk assessment tool.
B. Difficulties and Hazards
There might be difficulties in introducing COMPAS in India. Due to the wide socioeconomic differences in the nation, there is a chance that the tool will reinforce pre-existing prejudices, especially those directed toward marginalized communities. Furthermore, India does not have the extensive criminal databases needed to feed these algorithms.
Moreover, the black box problem would pose a serious challenge in India, where judicial decision-making transparency is crucial. Without a thorough understanding of how COMPAS determines its risk ratings, it might face substantial legal challenges similar to those seen in the U.S.
Legal and Ethical Considerations: Accountability, Due Process, and Transparency
There are a number of moral and legal issues surrounding the use of algorithmic tools in criminal justice that need to be carefully considered.
A. Openness and the Right to a Just Trial
COMPAS’s opaque algorithm goes against the core values of accountability and transparency in the legal system. A defendant needs to know the facts against them and how a risk score is determined in order to properly build a defence. COMPAS’s lack of transparency compromises this right and poses grave due process issues.[5]
B. Discrimination and Bias
Studies have demonstrated that COMPAS can disproportionately impact minority communities, as was previously mentioned. There may have been infractions of the equal protection principles as a result of this. To deal with this, any algorithm used in criminal justice must be rigorously tested for biases and subjected to independent oversight.
Conclusion: The Future of COMPAS and Algorithmic Tools in Criminal Justice
The introduction of algorithmic tools like COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) into criminal justice marks a significant technological shift, promising to streamline processes and bring data-driven objectivity to critical decisions related to sentencing, bail, and parole. However, the path forward for these tools is complex and filled with challenges. Their continued use brings up significant legal, ethical, and practical concerns that must be addressed to ensure that they contribute to fairness and justice, rather than entrenching existing disparities or undermining individuals’ rights.
I. Balancing Technological Efficiency and Fundamental Rights
One of the key benefits of algorithmic tools like COMPAS is their ability to deliver more consistent and objective decision-making, particularly in systems strained by heavy caseloads, overcrowded prisons, and limited resources. By assessing recidivism risk and helping guide pretrial release decisions, COMPAS offers an opportunity to alleviate some of this pressure. However, a fundamental question arises regarding how these technological advancements align with the preservation of key rights, such as the right to a fair trial and the ability to challenge the evidence presented.
The opaque nature of COMPAS and similar tools—often referred to as the “black box” phenomenon—has drawn significant criticism. In the notable case State v. Loomis, the defendant argued that COMPAS violated his due process rights because he could not fully understand or contest how the risk score was determined. Although the court ultimately allowed the use of COMPAS, it did so with caution, recommending that judges not rely on it exclusively and make note of its limitations.
Moving forward, the legal system will need to strike a delicate balance between the efficiency provided by these tools and the preservation of fundamental rights. Courts and legal bodies must prioritize transparency and accountability when using algorithmic tools. Defendants and their legal representatives need access to the methods behind these risk assessments to ensure that decisions made with these tools can be questioned and properly scrutinized.
II. Addressing Algorithmic Bias
One of the most troubling concerns surrounding algorithmic tools like COMPAS is the potential for bias. While these tools are often marketed as neutral, their outcomes are largely influenced by the data on which they are trained, which can reflect existing social and racial inequalities. For instance, investigations such as those conducted by ProPublica have shown that COMPAS tends to label Black defendants as high-risk more frequently than white defendants, even when the actual likelihood of reoffending is similar or lower. This raises concerns about whether these tools unintentionally perpetuate systemic racial biases.
Bias in algorithmic tools is a widely recognized issue, but its impact on criminal justice is particularly concerning. The implementation of COMPAS and similar systems must include thorough testing and regular audits to ensure that their predictions do not deepen existing disparities. This is not simply a technical challenge—it is a justice issue. Racial and socioeconomic inequalities are already significant in criminal justice systems worldwide, and biased algorithms could further embed these injustices.
To address this risk, jurisdictions that use tools like COMPAS must establish independent bodies that regularly audit and monitor the algorithms for signs of bias or inaccuracy. Both government agencies and private companies responsible for developing these tools need to be held accountable to ensure that their products do not harm the individuals they are intended to help. The datasets these algorithms rely on must be continuously evaluated to ensure that they represent the diversity of the populations they assess.
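The independent auditing proposed above could, in part, be automated. The sketch below is a minimal, hypothetical example of such a check: the tolerance value and group names are assumptions, not drawn from any existing audit standard, and a real audit body would examine many more metrics than a single error-rate gap.

```python
# Minimal sketch of an automated fairness check an independent audit
# body might run. The 0.05 tolerance and group names are hypothetical.

def audit_gap(rates_by_group, tolerance=0.05):
    """rates_by_group: {group_name: false_positive_rate}.
    Flags the tool if any two groups' FPRs differ by more than tolerance."""
    values = rates_by_group.values()
    gap = max(values) - min(values)
    return {"gap": round(gap, 3), "pass": gap <= tolerance}

# Rates like those in the ProPublica-style comparison would fail the check:
result = audit_gap({"group_a": 0.45, "group_b": 0.23})
# A gap of 0.22 far exceeds the 0.05 tolerance, so "pass" is False.
```

A recurring audit of this kind would give regulators a concrete, reportable trigger for intervention, rather than leaving bias detection to one-off journalistic investigations.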
III. Global Adoption and Key Lessons
While COMPAS is primarily used in the United States, other countries are also experimenting with similar risk assessment tools in their criminal justice systems. Jurisdictions such as the United Kingdom, Canada, and Australia have adopted their own versions of these tools, and their experiences provide valuable lessons for the future development and use of COMPAS.
In the United Kingdom, for instance, the Offender Assessment System (OASys) helps evaluate reoffending risks and guide rehabilitation strategies. However, unlike the U.S., the UK has placed greater emphasis on transparency and judicial oversight, which could serve as a model for improving tools like COMPAS. Judicial review of these tools can help mitigate concerns about fairness and due process, ensuring that individuals’ rights are upheld.
Similarly, Canada’s Level of Service/Case Management Inventory (LS/CMI) incorporates a culturally sensitive approach to offender risk assessment, particularly for Indigenous communities. This underscores the need for algorithmic tools to consider the social and cultural contexts of those being assessed, thereby addressing criticisms of bias and inequality in risk scoring.
In countries like India, where the criminal justice system is overburdened and cases suffer from prolonged delays, the use of a risk assessment tool similar to COMPAS could potentially bring substantial improvements. However, given India’s socioeconomic disparities and diverse population, introducing such tools could also amplify existing biases against marginalized groups, such as lower-income communities or minorities. Policymakers should carefully study global examples to ensure that these tools enhance fairness and do not disadvantage already vulnerable populations.
IV. The Need for Regulatory and Legislative Frameworks
As algorithmic tools continue to be integrated into criminal justice systems, there is a growing need for robust regulatory and legal frameworks to govern their use. These regulations should address concerns related to transparency, accountability, data privacy, and the right to contest algorithmic decisions.
In the United States, regulatory oversight is still developing, with courts and lawmakers only beginning to navigate the complexities of algorithmic decision-making. While *Loomis* offers some judicial guidance on how to approach tools like COMPAS, broader legislative action is needed. Courts cannot be expected to bear the full burden of regulating these technologies—lawmakers must enact rules that protect the rights of defendants and ensure fairness in the application of these tools.
Moreover, the protection of personal data is crucial. Tools like COMPAS rely heavily on individual data, and without proper safeguards, there is a risk that this data could be misused. As countries like India and others around the world develop their own data protection laws, it is vital to ensure that the use of algorithmic tools in criminal justice is appropriately regulated to guard against privacy violations.
V. The Essential Role of Human Oversight
While algorithms can offer valuable insights to aid decision-making, human oversight is essential. Judges, parole boards, and other legal authorities must retain ultimate responsibility for ensuring just outcomes. Algorithms should complement, not replace, human judgment by providing data to inform decisions.
The future success of COMPAS and other algorithmic tools depends on how well they are integrated into the broader justice system. While they can reduce human error and inconsistency, human oversight remains crucial in ensuring that these tools are used fairly. Training judges and legal professionals on how to interpret and appropriately use these tools will be key to maintaining a balance between technological innovation and traditional legal principles.
Conclusion
The future of COMPAS and similar algorithmic tools in criminal justice holds promise but is fraught with challenges. If carefully implemented, these tools have the potential to reduce recidivism, increase judicial efficiency, and provide greater objectivity in decision-making. However, their success hinges on addressing issues of transparency, bias, accountability, and the preservation of human oversight. Ensuring that these tools complement rather than compromise justice will require collaboration between courts, legislators, and policymakers. Lessons from other jurisdictions should guide their development to create a more fair and equitable criminal justice system for all.
Reference(s):
[1] State v. Loomis, 881 N.W.2d 749 (Wis. 2016).
[2] Julia Angwin et al., Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks, PROPUBLICA (May 23, 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
[3] R. v. Gladue, [1999] 1 S.C.R. 688 (Can.).
[4] Andrew Day & Sharon Casey, The Application of the Risk-Need-Responsivity Model to Offender Assessment and Treatment Planning for Australian Offenders, 43 AUST. PSYCH. 201 (2008).
[5] Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 STAN. L. REV. 1343 (2018).