- Limiting Parole: A new law pushed by Louisiana Governor Jeff Landry cedes much of the power of the parole board to an algorithm that prevents thousands of prisoners from early release.
- Immutable Risk Score: The risk assessment tool, TIGER, does not take into account efforts prisoners make to rehabilitate themselves. Instead, it focuses on factors that cannot be changed (a toy sketch of such a static-only score follows this list).
- Racial Bias: Civil rights attorneys say the new law could disproportionately harm Black people in part because the algorithm measures factors where racial disparities already exist.
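To make the "immutable" point concrete, here is a minimal, purely hypothetical sketch of a risk score built only from static factors. The field names and weights are invented for illustration and are not TIGER's actual inputs or formula; the point is only that a score like this has no input a prisoner can change after sentencing.

```python
# Hypothetical sketch (NOT TIGER's real inputs or weights): a risk score
# computed only from static factors fixed at sentencing.
from dataclasses import dataclass


@dataclass
class StaticRecord:
    age_at_first_arrest: int
    prior_convictions: int
    current_age: int


def static_risk_score(r: StaticRecord) -> float:
    """Toy score: earlier first arrest and more priors push the score up."""
    score = 0.0
    score += max(0, 25 - r.age_at_first_arrest) * 0.5  # earlier first arrest -> higher risk
    score += r.prior_convictions * 1.0                  # each prior conviction adds risk
    score -= max(0, r.current_age - 50) * 0.2           # age discount, still immutable
    return score


# Education, treatment, or good behavior never enter the function, so the
# score is identical before and after any rehabilitation effort.
record = StaticRecord(age_at_first_arrest=17, prior_convictions=3, current_age=70)
print(static_risk_score(record))  # same output no matter what the person has done since
```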
The angle makes complete sense if you understand it: one reason “AI” automation is bad is that it labels blind 70-year-olds as dangerous.
Blind 70-year-olds can still be dangerous. Being blind and old doesn’t prevent that.
They’re not saying that offloading the responsibility to an algorithm is good; they’re saying it’s weird to assume a person is harmless based on nothing but two attributes.
I agree in general that it’s bad for these kinds of decisions to be offloaded to an AI. A human should be the one to consider whether the blind 70-year-old is dangerous, because they definitely can be.
Operating a vehicle or weapon requires neither eyesight nor a clear mind if you don’t intend to do it safely.