Fairness in the COMPAS Recidivism Risk Algorithm

Created by Alyssa Sugarman, Eva Newsom, and Sammy Raucher

[Image: Judge's mallet in a digital network]

The "What Makes an Algorithm Fair?" Curriculum Package invites students to evaluate the COMPAS algorithm according to different definitions of fairness and, in the process, see the algorithm's place in the context of the history of pre-trial assessment and the institution of criminal justice. 

This project was realized with funding support from the Mozilla Responsible Computer Science Challenge grant.


Question 9a

If you had to decide between the three definitions of fairness above, which definition do you think would make “fair” decisions for everyone who goes through the court system? What values did you consider as you made this decision?

Algorithmic Fairness: Considering Different Definitions

Jupyter Notebook (requires upper-division computing)

[a version of this Jupyter Notebook for students with no data science and computing experience will be uploaded February 2021]

Grading rubrics and solutions are available to instructors upon request (hce@berkeley.edu).

Learning outcomes

By working through this notebook, students will be able to:

  • Identify how algorithms exist as part of socio-technical systems; that is, they are technological tools that are created by, and interact with, humans in social institutions.
  • Recognize that there are different definitions of fairness that can be applied to evaluate algorithms, and critically examine how these definitions emerge from and support different sociotechnical imaginaries of justice.
  • Understand how definitions of fairness are co-produced with racist systems/institutions. 
  • Recognize the limits and opportunities of technical solutions to problems of algorithmic bias, and what historical and institutional contexts need to be considered to aim for fairness in algorithm design and deployment.
  • Consider what kinds of community knowledge and other professional expertise besides their own are necessary to think through socio-technical issues.
  • Consider which issues relevant to the COMPAS ecosystem (i.e., the criminal justice system and affected communities) but outside the algorithm itself need to be addressed to create a more just system, with or without the algorithm.
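The package itself does not enumerate its fairness definitions on this page, but a minimal sketch can illustrate why "fairness" admits competing definitions that cannot all be satisfied at once. The sketch below computes three criteria commonly discussed in the COMPAS debate (demographic parity, equal false positive rates, and predictive parity) on a small hypothetical dataset; the groups, predictions, and outcomes are invented for illustration and are not COMPAS data.

```python
# Hypothetical toy data, not from COMPAS or the notebook.
# Each row: (group, model flagged "high risk"?, actually reoffended?)
data = [
    ("A", True,  True), ("A", True,  False), ("A", False, False), ("A", False, True),
    ("B", True,  True), ("B", True,  True),  ("B", True,  False), ("B", False, False),
]

def rates(group):
    """Compute three fairness-relevant rates for one group."""
    rows = [r for r in data if r[0] == group]
    flagged = [r for r in rows if r[1]]
    did_not_reoffend = [r for r in rows if not r[2]]
    return {
        # Demographic parity compares this across groups:
        # the share of the group flagged high risk.
        "flag_rate": len(flagged) / len(rows),
        # Equal false positive rates compares this:
        # the share flagged among those who did not reoffend.
        "fpr": sum(r[1] for r in did_not_reoffend) / len(did_not_reoffend),
        # Predictive parity compares this:
        # the reoffense rate among those flagged (precision).
        "ppv": sum(r[2] for r in flagged) / len(flagged),
    }

for g in ("A", "B"):
    print(g, rates(g))
```

On this toy data the two groups have equal false positive rates but different flag rates and different precision, so an algorithm can be "fair" under one definition and "unfair" under another at the same time. That tension is the substance of the student exercise above.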


Mini-lecture: Algorithms and Pretrial Assessment

HCE Toolkit Connection


Expertise; Sociotechnical System; Sociotechnical Imaginaries; Identity/Positionality


Authors of "What Makes an Algorithm Fair?" share their perspectives on what is important when creating HCE curriculum.

Adoptions and Adaptations

Material from this Curriculum Package is intended for DATA 4AC: Data and Justice and DATA 102: Data, Inference, and Decisions