In early March, a manipulated video of Ukrainian President Volodymyr Zelenskyy was circulated.

In it, a digitally generated Zelenskyy told the Ukrainian national army to surrender.

However, deepfakes are being used successfully in assistive technology.

For instance, people who suffer from Parkinson's disease can use voice cloning to communicate.

Deepfakes can be hyper-realistic, and basically undetectable by human eyes.

The Conversation

But the same voice-cloning technology could be used for phishing, defamation, and blackmail.

While the technology behind deepfakes may sound complicated, it is a simple matter to produce one.

There are numerous online applications, such as Faceswap and ZAO Deepswap, that can produce deepfakes within minutes.

In 2019, approximately 15,000 videos using deepfakes were detected.

And this number is expected to increase.

Deepfakes are the perfect tool for disinformation campaigns because they produce believable fake news that takes time to debunk.

Meanwhile, the damage resulting from deepfakes, especially those that affect people's reputations, is often long-lasting and irreversible.

Perhaps the most dangerous ramification of deepfakes is how they lend themselves to disinformation in political campaigns.

We saw this when Donald Trump designated any unflattering media coverage as fake news.

The credibility of authorities and the media is being undermined, creating a climate of distrust.

And with the rising proliferation of deepfakes, politicians could easily deny culpability in any emerging scandals.

How can someone's identity in a video be confirmed if they deny it?

Human-AI partnerships can help deal with the rising risk of deepfakes by having people verify the information.
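The human-in-the-loop idea above can be sketched as a simple triage rule: an automated detector scores each clip, and only the uncertain cases are routed to human reviewers. This is a minimal illustration, assuming a hypothetical detector score between 0 and 1; a real system would compute that score with a trained classifier.

```python
# Minimal sketch of human-AI triage for deepfake detection.
# `model_score` is a hypothetical detector's estimated probability
# that a clip is fake (not a real library's output).

def triage(clips, auto_threshold=0.9, review_threshold=0.5):
    """Auto-flag confident fakes; send uncertain clips to human review."""
    flagged, needs_review, cleared = [], [], []
    for clip_id, model_score in clips:
        if model_score >= auto_threshold:
            flagged.append(clip_id)        # confident fake: flag automatically
        elif model_score >= review_threshold:
            needs_review.append(clip_id)   # uncertain: a person verifies
        else:
            cleared.append(clip_id)        # likely authentic
    return flagged, needs_review, cleared

scores = [("a.mp4", 0.95), ("b.mp4", 0.60), ("c.mp4", 0.10)]
print(triage(scores))  # (['a.mp4'], ['b.mp4'], ['c.mp4'])
```

The design point is that human effort is spent only on the middle band of scores, where automated detection is least reliable.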
