And there's little reason to believe anyone's going to do anything about it.
The problem is that black-box AI systems are a goldmine for startups, big tech, and politicians.
Let's start with the individual officers.

Many officers have access to facial recognition software that allows them to take a picture of anyone and surface their identity.
And it's as easy to use as Netflix or Spotify.
Officers use these systems because they make their jobs much easier.

They allow police officers to skip the warrant process and act as judges themselves.
What about police departments and other agencies?
Predictive policing is among the most common unethical uses of AI by law enforcement.
But, as we all know, you can't predict when or where a crime is going to happen.
All you can do is determine, historically, where police tend to arrest the most people.
What predictive policing systems actually do is give the police a scapegoat for over-policing minority and poor communities.
The bottom line is that you cannot, mathematically speaking, draw inferences from data that doesn't exist.
And there is no data on future crime.
What about other AI systems?
Vice published an article today detailing the Chicago police department's use of ShotSpotter, an AI system purported to detect gunshots.
According to the company, it can detect gunshots in large areas with up to 95% accuracy.
But in court, they claim that's just a marketing guarantee and that background noise can affect the accuracy of any reading.
Which means it's a black-box system that nobody can explain and no legal department will defend.
And in another case, they had an employee change a sound's designation from "fireworks" to "gunshot" in order to facilitate an arrest.
When the evidence was challenged in court, prosecutors merely withdrew it.
To the best of our knowledge nobody was arrested or indicted.
But the worst part isn't that ShotSpotter's readings can't be trusted. It's that, even if it did work, it serves absolutely no purpose.
Have you ever heard a firearm discharge? Gunshots are loud, and when people hear them, they call the police.
And people, unlike black-box algorithms, can testify in court.
We can hold people accountable.
Ignorance-based capitalist apathy
The reason there's so much unethical cop AI is simple: it's incredibly profitable.
The politicians authorizing the payouts are raking in money from lobbyists.
And the cops using it can ignore our Constitutional rights at their leisure with absolutely no fear of reprisal.
Its a perfect storm of ignorance, corruption, and capitalism.
And its only going to get worse.
The US founding fathers, much like AI, could not predict the future.
They could not have foreseen technology that lets police identify, track, and accuse anyone without a warrant, and our Fourth Amendment rights have not survived it.
Now those protections are gone.
You don't need a predictive algorithm to understand, historically speaking, what happens next.