2019-02-07 04:02
Here’s a choose-your-own-adventure game nobody wants to play: you’re a United States judge tasked with deciding bail for a black man, a first-time offender, accused of a non-violent crime. An algorithm just told you there’s a 100 percent chance he’ll re-offend.
With no further context, what do you do? Judges in the US employ algorithms to predict an offender's likelihood of committing further crimes, their flight risk, and a handful of other factors. These data points are then used to guide humans in sentencing, bail, and decisions to grant (or deny) parole. Unfortunately, the algorithms are biased 100 percent…
This story continues at The Next Web.