The Fairness Fallacy: Northpointe and the COMPAS Recidivism Prediction Algorithm
Recent advances in machine learning have attracted public attention and expanded into almost every domain, including the criminal justice system. These criminogenic algorithms are trained on real-world data to make highly specific predictions about criminal behavior and personality traits.
Because algorithms can reflect real-world discrimination, they pose novel risks to the human rights obligation of non-discrimination under the law. Debates over which quantifiable fairness metrics are appropriate remain largely unresolved, and the question is entirely unaddressed under international law. This research proposes a preliminary set of requirements for assessing the fairness of criminal justice algorithms, aggregated from General Recommendations of the United Nations.
Through a case study of Northpointe, Inc.’s COMPAS recidivism prediction algorithm, focusing on its use in the state of New York, these requirements are contextualized against features of the algorithmic assessment process and computational norms for validation studies. This thesis sheds light on features of criminogenic assessment algorithms that may bias outcomes against protected groups. It also highlights major research limitations, including the proprietary nature of these algorithms, which poses a significant obstacle to fully understanding their potential biases.
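The unresolved debate over fairness metrics mentioned above can be made concrete with a small sketch. The COMPAS controversy hinged on the fact that a risk score can be calibrated within each group (a given score means the same reoffense rate for everyone) while still producing unequal false positive rates across groups. The toy data below is entirely hypothetical, not drawn from COMPAS; it only illustrates why the two metrics can diverge.

```python
# Hypothetical records, not COMPAS data: (group, risk_score, reoffended).
# Group B simply has more high-score individuals than group A.
records = [
    ("A", 0.2, 0), ("A", 0.2, 0), ("A", 0.2, 0), ("A", 0.2, 0), ("A", 0.2, 1),
    ("A", 0.8, 1), ("A", 0.8, 1), ("A", 0.8, 1), ("A", 0.8, 1), ("A", 0.8, 0),
    ("B", 0.2, 0), ("B", 0.2, 0), ("B", 0.2, 0), ("B", 0.2, 0), ("B", 0.2, 1),
    ("B", 0.8, 1), ("B", 0.8, 1), ("B", 0.8, 1), ("B", 0.8, 1), ("B", 0.8, 0),
    ("B", 0.8, 1), ("B", 0.8, 1), ("B", 0.8, 1), ("B", 0.8, 1), ("B", 0.8, 0),
]

def calibration(group):
    """Observed reoffense rate per score value within a group."""
    out = {}
    for score in {s for g, s, _ in records if g == group}:
        ys = [y for g, s, y in records if g == group and s == score]
        out[score] = sum(ys) / len(ys)
    return out

def false_positive_rate(group, threshold=0.5):
    """Share of non-reoffenders flagged as high risk at the threshold."""
    negatives = [s for g, s, y in records if g == group and y == 0]
    flagged = [s for s in negatives if s >= threshold]
    return len(flagged) / len(negatives)

for g in ("A", "B"):
    print(g, calibration(g), round(false_positive_rate(g), 2))
# Both groups are calibrated (score 0.2 -> 20% reoffend, 0.8 -> 80%),
# yet group B's false positive rate exceeds group A's.
```

Because base rates differ between the groups, no threshold choice can equalize both metrics at once, which is why the choice of fairness metric is a normative question rather than a purely computational one.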
Files
- Thomas, Sedona_ Thesis Semester, 2023 - Sedona Thomas.pdf (application/pdf, 1.64 MB)
More About This Work
- Academic Units: Institute for the Study of Human Rights
- Thesis Advisors: Holland, Tracey M.
- Degree: B.A., Columbia University
- Published Here: October 4, 2023