
Technology is biased too. How do we fix it? [EN]


Curated by @cris.

A FiveThirtyEight piece discusses the discrimination that the indiscriminate use of algorithms and artificial intelligence can lead to, particularly in the criminal justice system:

Consider COMPAS, a widely used algorithm that assesses whether defendants and convicts are likely to commit crimes in the future. The risk scores it generates are used throughout the criminal justice system to help make sentencing, bail and parole decisions.

At first glance, COMPAS appears fair: White and black defendants given higher risk scores tended to reoffend at roughly the same rate. But an analysis by ProPublica found that, when you examine the types of mistakes the system made, black defendants were almost twice as likely to be mislabeled as likely to reoffend — and potentially treated more harshly by the criminal justice system as a result. On the other hand, white defendants who committed a new crime in the two years after their COMPAS assessment were twice as likely as black defendants to have been mislabeled as low-risk. (COMPAS developer Northpointe — which recently rebranded as Equivant — issued a rebuttal in response to the ProPublica analysis; ProPublica, in turn, issued a counter-rebuttal.)
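To make the distinction in the excerpt concrete, here is a minimal Python sketch using hypothetical confusion-matrix counts (not the real COMPAS or ProPublica figures). It shows how a risk score can look fair on one metric, with high-risk labels equally predictive for both groups, while its mistakes fall very unevenly across them.

```python
# Hypothetical illustration of the calibration vs. error-rate tension
# described above. The counts are invented for clarity; they are not
# the actual COMPAS or ProPublica data.

def rates(tp, fp, fn, tn):
    ppv = tp / (tp + fp)  # of those labeled high-risk, share who reoffended
    fpr = fp / (fp + tn)  # non-reoffenders wrongly labeled high-risk
    fnr = fn / (fn + tp)  # reoffenders wrongly labeled low-risk
    return ppv, fpr, fnr

# Hypothetical confusion-matrix counts per group: (tp, fp, fn, tn).
groups = {
    "group_a": (40, 20, 10, 30),
    "group_b": (30, 15, 20, 45),
}

for name, counts in groups.items():
    ppv, fpr, fnr = rates(*counts)
    print(f"{name}: PPV {ppv:.0%}, "
          f"false positive rate {fpr:.0%}, "
          f"false negative rate {fnr:.0%}")

# Output:
# group_a: PPV 67%, false positive rate 40%, false negative rate 20%
# group_b: PPV 67%, false positive rate 25%, false negative rate 40%
#
# Among those flagged high-risk, both groups reoffend at the same rate
# (equal PPV), yet group_a's non-reoffenders are flagged far more often,
# and group_b's reoffenders slip through far more often -- the same kind
# of asymmetry ProPublica reported for COMPAS.
```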


