And, according to the popular media, they have no idea how it works!

And I'd like to sell you an NFT of the Brooklyn Bridge.

Prediction and identification are two entirely different things.

No, MIT’s new AI can’t determine a person’s race from medical images

When a prediction is wrong, it's still a prediction.

When an identification is wrong, it's a misidentification.

These are important distinctions.

AI models can be fine-tuned to predict anything, even concepts that aren't real.


Because lemons don't have aliens in them.

The MIT model achieves less than 99% accuracy on labeled data.

This would defeat the purpose of using AI in the first place.

It's important for AI developers to consider the potential risks of their creations.

But this particular warning bears little grounding in reality.

But this AI can't determine race.

It predicts labels in specific datasets.
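This distinction can be sketched in code. The toy classifier below (all data, labels, and names here are made up for illustration; this is not the MIT model) is trained on arbitrary labels attached to random vectors. It will always return one of its training labels with equal confidence in its own machinery, whether the label names something real or not:

```python
# A minimal sketch: a classifier predicts whichever labels it was
# trained on, whether or not those labels correspond to anything real.
import random

random.seed(0)

# 200 fake "scans" as random feature vectors, with arbitrary labels
# attached by the dataset's creators: 0 = "no alien", 1 = "alien".
data = [([random.gauss(0, 1) for _ in range(8)], random.randint(0, 1))
        for _ in range(200)]

# Nearest-centroid "model": average the vectors seen for each label.
def centroid(label):
    rows = [x for x, y in data if y == label]
    return [sum(col) / len(rows) for col in zip(*rows)]

centroids = {label: centroid(label) for label in (0, 1)}

def predict(x):
    # Pick the label whose centroid is closest. The model always
    # returns *some* label, however meaningless that label is.
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda lb: dist(centroids[lb]))

scan = [random.gauss(0, 1) for _ in range(8)]
print(predict(scan))  # emits 0 or 1: a prediction, not an identification
```

The model never "detects aliens"; it only reports which pile of labeled training vectors a new input most resembles. Swap the label strings for anything else and the mechanics are identical.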

And the more images the AI processes, the more mistakes it's certain to make.

In summation: MIT's new AI is nothing more than a magician's illusion.

This AI can only predict labels.

It doesn't identify race.
