Linear Digressions
Ben Jaffe and Katie Malone
291 episodes
9 months ago
In each episode, your hosts explore machine learning and data science through interesting (and often very unusual) applications.
Technology
All content for Linear Digressions is the property of Ben Jaffe and Katie Malone and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by Podjoint in any way.
Racism, the criminal justice system, and data science
31 minutes 36 seconds
5 years ago
As protests sweep across the United States in the wake of the killing of George Floyd by a Minneapolis police officer, we take a moment to dig into one of the ways that data science perpetuates and amplifies racism in the American criminal justice system. COMPAS is an algorithm that claims to predict the likelihood that an offender will re-offend if released, based on the attributes of the individual, and guess what: it shows disparities in its predictions for black and white offenders that would nudge judges toward giving harsher sentences to black individuals. We dig into this algorithm a little more deeply, unpacking how different metrics give different pictures of the “fairness” of the predictions and what is causing its racially disparate output (to wit: race is explicitly not an input to the algorithm, and yet the algorithm gives outputs that correlate with race, so what gives?). Unfortunately, it’s not an open-and-shut case of a tuning parameter being off or the wrong metric being used: instead, the biases in the justice system itself are captured in the algorithm’s outputs, in such a way that a self-fulfilling prophecy of harsher treatment for black defendants is all but guaranteed. Like many other things this week, this episode left us thinking about bigger, systemic issues, and why it has proven so hard, for years, to fix what’s broken.
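The tension the episode describes, where a risk score can be equally well calibrated for two groups yet still flag non-reoffenders in one group far more often, can be made concrete with a toy calculation. The sketch below uses invented counts for two hypothetical groups, not the actual COMPAS or ProPublica data; the group sizes, flag rates, and re-offense counts are purely illustrative.

# Toy illustration (invented counts, not the real COMPAS or ProPublica data)
# of how a risk score can satisfy one fairness metric while violating another.

def fairness_metrics(flagged_reoffend, flagged_no_reoffend,
                     unflagged_reoffend, unflagged_no_reoffend):
    """Return (PPV, FPR) for one group.

    PPV: of the people flagged high risk, what fraction actually re-offended.
    FPR: of the people who did NOT re-offend, what fraction were flagged anyway.
    """
    ppv = flagged_reoffend / (flagged_reoffend + flagged_no_reoffend)
    fpr = flagged_no_reoffend / (flagged_no_reoffend + unflagged_no_reoffend)
    return ppv, fpr

# Hypothetical counts for two groups of 100 people each. Group A has a higher
# base rate of re-offense in the historical data, so more people are flagged.
groups = {
    "A": dict(flagged_reoffend=36, flagged_no_reoffend=24,
              unflagged_reoffend=12, unflagged_no_reoffend=28),
    "B": dict(flagged_reoffend=12, flagged_no_reoffend=8,
              unflagged_reoffend=8, unflagged_no_reoffend=72),
}

for name, counts in groups.items():
    ppv, fpr = fairness_metrics(**counts)
    print(f"group {name}: P(reoffend | flagged) = {ppv:.2f}, "
          f"false positive rate = {fpr:.2f}")

# Both groups are calibrated identically (PPV = 0.60), yet non-reoffenders in
# group A are flagged far more often (FPR 0.46 vs 0.10). When the base rates
# differ, these two notions of "fairness" cannot both hold at once.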