New York to crack open its code, looking for bias

In 2016, ProPublica released a study which found that algorithms used across the US to predict future criminals are biased against black people. These algorithms produce “risk assessments” by crunching answers to questions such as whether a defendant’s parents ever did jail time, how many people they know who take illegal drugs, how often they’ve missed bond hearings, and whether they believe that hungry people have a right to steal. ProPublica reached its conclusion after analyzing what it called “remarkably unreliable” risk assessments assigned to defendants.

Read full news article on Naked Security

 

