An algorithm just told you there's a 100 percent chance he'll re-offend.
With no further context, what do you do?
Unfortunately, the algorithms making these predictions are biased 100 percent of the time.
The research paper indicates there are several major bias traps that algorithm-based prediction systems fail to overcome.
On the surface, algorithms seem friendly enough.
But even an allegedly unbiased algorithm can't overcome biased data or inconsistent implementation.
Because of this, we have what's called the Ripple Effect Trap.
Human intervention doesn't make a logic system more logical.
It just adds data that's based on hunches and experience to a system that doesn't understand those concepts.
The result is an echo-chamber for bias.
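Here's a quick sketch of what that echo chamber looks like in practice. It's our own toy simulation, not code from the paper, and every name and number is made up: two districts offend at exactly the same rate, but the one with slightly more arrests on record gets patrolled more, and so keeps "earning" more arrests.

```python
import random

random.seed(42)

# Two hypothetical districts with IDENTICAL true offense rates.
true_offense_rate = {"district_a": 0.10, "district_b": 0.10}

# Historical data starts slightly skewed: district_a was patrolled more.
recorded_arrests = {"district_a": 120, "district_b": 100}

def patrol_allocation(arrests):
    """'Model': allocate 1,000 patrols proportionally to past arrests."""
    total = sum(arrests.values())
    return {d: round(1000 * n / total) for d, n in arrests.items()}

for year in range(10):
    patrols = patrol_allocation(recorded_arrests)
    for district, n_patrols in patrols.items():
        # You can only record offenses where you actually look for them.
        found = sum(random.random() < true_offense_rate[district]
                    for _ in range(n_patrols))
        recorded_arrests[district] += found

print(recorded_arrests)
# district_a stays permanently "riskier" in the data, and the raw gap
# keeps widening, even though both districts offend at identical rates.
# The skewed starting data never washes out; the model confirms itself.
```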
Perhaps the biggest problem with algorithms is that they're based on math; justice is not.
This is called the Formalism Trap.
Algorithms are just math; they can't self-correct for bias.
And you can't take a system built to produce fair criminal justice results and apply it to employment.
How we think about fairness in those contexts is just totally different.
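To see why fairness resists formalization, here's a toy example of our own (all numbers invented): the same set of predictions passes one common mathematical definition of fairness, demographic parity, while flunking another, equal false positive rates.

```python
# Toy predictions for two groups (1 = flagged as high risk).
group_a = {"flagged": [1]*5 + [0]*5,
           "reoffended": [1, 1, 1, 1, 0] + [0]*5}
group_b = {"flagged": [1]*5 + [0]*5,
           "reoffended": [1, 0, 0, 0, 0] + [0]*5}

def flag_rate(g):
    """Demographic parity: what fraction of the group gets flagged?"""
    return sum(g["flagged"]) / len(g["flagged"])

def false_positive_rate(g):
    """Of people who did NOT reoffend, how many were flagged anyway?"""
    innocent = [f for f, y in zip(g["flagged"], g["reoffended"]) if y == 0]
    return sum(innocent) / len(innocent)

print(flag_rate(group_a), flag_rate(group_b))  # 0.5 0.5 -> "fair"
print(false_positive_rate(group_a))            # ~0.17
print(false_positive_rate(group_b))            # ~0.44 -> "unfair"
```

Both groups get flagged at the same rate, yet innocent people in one group are flagged far more often than in the other. Which number counts as "fair" is a human judgment the math can't make for you.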
Algorithms are biased towards meeting developers' expectations, not fairness.
We can demonstrate this by looking at how these systems are built and tested.
If the algorithm fails, its developers adjust it and run it again.
They keep doing this until they can say their algorithm meets the client's threshold for success.
This is how black box research works.
Neither these black box systems nor their developers can explain false negatives or false positives.
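Here's a hypothetical sketch of what that tune-until-it-passes loop can look like; it's not any vendor's actual workflow. The developers nudge a risk threshold until overall accuracy clears the client's bar, and nothing in the loop ever asks who got misclassified:

```python
import random

random.seed(0)

# Hypothetical: 1,000 risk scores paired with ground-truth outcomes.
data = [(random.random(), random.random() < 0.3) for _ in range(1000)]

CLIENT_TARGET = 0.65  # "success" as the client defined it

def accuracy(threshold):
    hits = sum((score >= threshold) == reoffended
               for score, reoffended in data)
    return hits / len(data)

# Tune until it passes: nudge the threshold until the target is met.
threshold = 0.5
while accuracy(threshold) < CLIENT_TARGET:
    threshold += 0.01

print(f"shipped threshold={threshold:.2f}, "
      f"accuracy={accuracy(threshold):.2f}")
# The loop exits the moment the headline number looks good. Nothing
# here ever asks WHO the false positives are, or WHY they were flagged.
```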
This brings us to the Framing Trap.
Within the algorithmic frame, any notion of fairness cannot even be defined.
We refer to this abstraction level as the data frame.
This data frame is where humans come in.
We'll hold for applause.
The data used to fuel these black box systems is based on real cases.
To create clean data, we'd have to let algorithms make all the decisions in an untouched legal environment.
We need an additional frame to overcome this problem: a sociotechnical frame.
And that brings us to the final trap: the Portability Trap.
The others are small potatoes compared to this one.
Portability is a deeply ingrained measure of cleanliness in code.
Basically, a good programmer makes a box and the client decides what to put in it.
Whatever the client's stuff is, you get a box that'll fit all of it.
And you’re able to reuse that box later to store books, or human heads.
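In code, the box is a generic container, and writing one is genuinely good practice. Here's a minimal sketch (our illustration) of why the very property that makes it clean code makes it dangerous policy:

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")

class Box(Generic[T]):
    """A perfectly portable container: it works the same no matter
    what you put in it, precisely because it understands none of it."""
    def __init__(self) -> None:
        self._items: List[T] = []

    def put(self, item: T) -> None:
        self._items.append(item)

    def contents(self) -> List[T]:
        return list(self._items)

shoes: Box[str] = Box()
shoes.put("sneakers")

# The exact same code, reused for a different domain with zero changes.
# Context-free reuse is a feature in software and a bug in justice.
defendants: Box[str] = Box()
defendants.put("case #4521")
```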
But that's the opposite of what the justice system should be doing.
You might be wondering how we're supposed to fix the problem.
The researchers believe the solution involves critical evaluation and increased interaction between disciplines.
Edit 13:06 CST 2/7/2019: Added context.