Product details


Abstract

This case is part of a case series within the Giving Voice to Values (GVV) curriculum. Timothy Brennan is the founder and CEO of the technology company Northpointe, Inc. (Northpointe) and the creator of its flagship software product, COMPAS, an artificial-intelligence tool used by US court systems to predict a defendant's likelihood of reoffending and to inform bail, parole, and probation decisions. Brennan originally created COMPAS to standardize decision-making within the criminal justice system and to reduce the likelihood of human error or bias affecting court rulings. However, years after COMPAS's public release and widespread adoption by US courts, an investigative journalism report claims that COMPAS is more likely to mislabel Black defendants as higher risk, and White defendants as lower risk, of recidivism. To complicate matters, any coding adjustments Northpointe might make to uncover or address the source of the bias could reduce the software's performance or reveal sensitive operational information to competitors. In the A case, Brennan's challenge is to organize a response to investigate bias within the COMPAS software while still protecting the complexity and intellectual property of the product. In this B case, students read a synopsis of Brennan's actual response and review its implications for Northpointe and the US criminal justice system; they are encouraged to consider how Brennan could have responded more creatively and constructively.
