It's difficult to tell whether the widespread use of predictive policing AI is the result of capitalism or ignorance.

AI cannot predict crime; it's ridiculous to think it could.

What it can do is provide a mathematical smokescreen for unlawful police practices.

Predictive policing is a scam that perpetuates systemic bias

And it does this very well, according to AI experts.

Think about that for a second.

That's the very definition of inherent systemic bias.

What if we mixed rat-feces-infused water with flour to make dough and baked breadsticks?

Dirty data is the rat feces of the machine learning world.

But the real problem is ignorance.

People seem to think AI has mysterious fortune-telling powers.

Artificial intelligence can predict the future no better than a Magic 8-ball.

At least the 8-ball gives you a fair shake.

The point is: when AI systems predict, they're guessing.

We'll explain…

Say you build a neural network meant to guess, from a photo of someone's face, whether they prefer chocolate or vanilla. You train it on one million images of people's faces.

You look over the results and determine it was correct 32 percent of the time.

That simply won't do.

You tweak the algorithm and run it again.

We'll say 97 percent is what you were going for.

Your neural network can now determine with 97 percent accuracy whether a person likes chocolate or vanilla.

Except it can't.

It cannot actually tell whether a person prefers chocolate or vanilla.
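To see why, here's a minimal sketch (assuming scikit-learn and NumPy; the data is random stand-in numbers, not anything from a real system) of how a model can ace its own training data while learning nothing at all:

```python
# A toy classifier trained to predict an arbitrary "chocolate vs. vanilla"
# label from random stand-in features. There is no real signal to find.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))    # pretend these are face embeddings
y = rng.integers(0, 2, size=1000)  # arbitrary flavor labels, pure noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained model happily memorizes the training set...
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("training accuracy:", model.score(X_train, y_train))  # near 100 percent
print("test accuracy:", model.score(X_test, y_test))        # roughly a coin flip
```

Tweak long enough and you can hit whatever training number you were going for; the model still knows nothing about anyone's flavor preference.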

It's an idiot system.

Artificial intelligence has no awareness.

If you feed dirty data to a system, it will give you whatever results you want.

What's dirty data?

In the law enforcement community, it's any data that's derived from illegal police practices.

That's enough to raise some eyebrows.

Crime data is subjective.

Remember the chocolate or vanilla example?

That's what these AI startups do.
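Here's a toy simulation of that feedback loop (the numbers are entirely hypothetical, not drawn from any real deployment): two neighborhoods with identical underlying crime rates, where one starts out over-policed and past arrests decide where patrols go next:

```python
# Two neighborhoods with the same true crime rate. Neighborhood A starts
# with more recorded arrests because it was historically over-patrolled.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.10         # identical in both neighborhoods
TOTAL_PATROLS = 100            # patrols allocated each year
arrests = {"A": 20, "B": 5}    # the "dirty data" the model trains on

for year in range(10):
    total_arrests = sum(arrests.values())
    for hood in arrests:
        # The "predictive" model sends patrols where past arrests were made.
        patrols = round(TOTAL_PATROLS * arrests[hood] / total_arrests)
        # Arrests only happen where officers are actually looking.
        arrests[hood] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols)
        )

print(arrests)  # A's record keeps pulling ahead, despite identical crime
```

The system's predictions look self-confirming: it sends officers where arrests were already recorded, and the arrests those patrols generate become next year's training data.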

The Washington University Law Review also investigated predictive policing.

When the individuals these systems flagged did commit a crime, they were punished more severely.

Kansas City is currently in the midst of a police scandal.

It's safe to assume there's some dirty data in that mix somewhere.

In New Orleans, a company called Palantir provided predictive policing software to police in secret.

The solution

There's simply no way for vendors of predictive policing systems to compensate for bad data.

There's no such thing as a universal rat-shit filter for AI systems.

We'll have dirty data as long as there are biased law enforcement officers.

The only solution is to remove black-box systems from the justice system and law enforcement communities entirely.

As Dr. Martin Luther King Jr. said: "Injustice anywhere is a threat to justice everywhere."
