"Machine Bias," screamed the headline. The tagline read: "There's software used across the country to predict future criminals. And it's biased against blacks."
In a revealing exposé in 2016, ProPublica, a US-based Pulitzer Prize-winning non-profit news organisation, analysed a piece of software called COMPAS, used by US courts to forecast which defendants are most likely to re-offend, and found it biased against African-Americans.
Guided by inputs from the algorithm, police and judges in America made decisions on defendants and convicts, determining everything from bail amounts to sentences. The report concluded that COMPAS was twice as likely to falsely label black defendants as future criminals as it was white defendants.
Further, it was more likely to falsely label white defendants as low risk. In other words, a disproportionate share of the black defendants rated high-risk never went on to commit crimes, while a disproportionate share of the white defendants rated low-risk did. The supposedly unbiased algorithm was demonstrably unfair.
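The disparity ProPublica found is a difference in group-wise error rates: the false positive rate (non-reoffenders flagged high-risk) and the false negative rate (reoffenders rated low-risk). A minimal sketch of that comparison is below; the numbers are invented for illustration and are not the actual COMPAS data.

```python
def error_rates(records):
    """Compute false positive and false negative rates.

    Each record is a pair (predicted_high_risk, reoffended), both booleans.
    FPR: share of non-reoffenders who were flagged high-risk.
    FNR: share of reoffenders who were rated low-risk.
    """
    fp = sum(1 for pred, actual in records if pred and not actual)
    tn = sum(1 for pred, actual in records if not pred and not actual)
    fn = sum(1 for pred, actual in records if not pred and actual)
    tp = sum(1 for pred, actual in records if pred and actual)
    return fp / (fp + tn), fn / (fn + tp)

# Hypothetical toy data, NOT the real COMPAS figures:
# (flagged_high_risk, actually_reoffended)
group_a = ([(True, False)] * 40 + [(False, False)] * 60 +
           [(True, True)] * 70 + [(False, True)] * 30)
group_b = ([(True, False)] * 20 + [(False, False)] * 80 +
           [(True, True)] * 50 + [(False, True)] * 50)

fpr_a, fnr_a = error_rates(group_a)
fpr_b, fnr_b = error_rates(group_b)
print(f"Group A: FPR={fpr_a:.2f}, FNR={fnr_a:.2f}")  # FPR=0.40, FNR=0.30
print(f"Group B: FPR={fpr_b:.2f}, FNR={fnr_b:.2f}")  # FPR=0.20, FNR=0.50
```

In this toy example, group A's false positive rate is double group B's, while group B's false negative rate is higher, which is the shape of the asymmetry the report described.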