Our lives are increasingly affected by algorithms. People may be denied loans, jobs, insurance policies, or even parole on the basis of risk scores that these algorithms produce. Yet algorithms are notoriously prone to bias.
For example, algorithms used to assess the risk of criminal recidivism often have higher error rates for minority ethnic groups. As ProPublica found, the COMPAS algorithm – widely used to predict re-offending in the US criminal justice system – had a higher false positive rate for black defendants than for white defendants: black people were more likely to be wrongly predicted to re-offend. Findings such as these…
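To make the key metric concrete: a false positive rate here is the share of people who did not re-offend but were nonetheless flagged as likely to. Below is a minimal sketch of how such a per-group comparison can be computed. It uses made-up labels and predictions, not the actual COMPAS records; every name and value in it is hypothetical.

```python
# Illustrative only: synthetic data, not ProPublica's COMPAS analysis.
import numpy as np

def false_positive_rate(y_true, y_pred):
    """FPR = FP / (FP + TN): the fraction of actual non-reoffenders
    (y_true == 0) who were wrongly predicted to re-offend (y_pred == 1)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    negatives = y_true == 0
    false_positives = negatives & (y_pred == 1)
    return false_positives.sum() / negatives.sum()

# Hypothetical group labels, true outcomes, and model predictions.
group  = np.array(["A", "A", "A", "B", "B", "B", "B", "A"])
y_true = np.array([0,   0,   1,   0,   0,   1,   0,   1 ])
y_pred = np.array([1,   0,   1,   1,   1,   1,   0,   1 ])

# Compare error rates across groups: a gap like the one printed here
# is the kind of disparity ProPublica reported.
for g in np.unique(group):
    mask = group == g
    fpr = false_positive_rate(y_true[mask], y_pred[mask])
    print(f"group {g}: FPR = {fpr:.2f}")
```

On this toy data the two groups end up with different false positive rates (0.50 vs. 0.67) even though a single model produced all the predictions, which is exactly the pattern at issue in the COMPAS findings.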
This story continues at The Next Web.