Digital video surveillance systems can do more than just identify who someone is. They can also work out how someone is feeling and what kind of personality they have, and even predict how they might behave in the future.


And the key to unlocking this information about a person is the movement of their head.

That is the claim made by the company behind the VibraImage artificial intelligence (AI) system.

Among other things, the system's applications include identifying suspect individuals in crowds of people.


It is also used to grade the mental and emotional states of employees.

Users of VibraImage include police forces, the nuclear industry and airport security.

The technology has already been deployed at two Olympic Games, a FIFA World Cup and a G7 Summit.


Across east Asia and beyond, algorithmic security, surveillance, predictive policing and smart city infrastructure are becoming mainstream.

VibraImage forms one part of this emerging infrastructure.

VibraImage has been developed by Russian biometrist Viktor Minkin through his company ELSYS Corp since 2001.

Other emotion detection systems attempt to calculate people's emotional states by analysing their facial expressions.

The analysis of facial expressions to identify emotions has come under growing criticism in recent years.

Could VibraImage provide a more accurate approach?

Minkin puts forward two theories apparently supporting the idea that these head movements are tied to emotional states.

What's more, Minkin claims the energy associated with these states can be measured through tiny vibrations of the head.
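As an illustration only: public descriptions of VibraImage suggest it derives "vibration" parameters, such as the frequency and amplitude of head micro-movements, from inter-frame changes in ordinary video. The sketch below shows one generic way such parameters could be computed. It is a minimal approximation under that assumption, not ELSYS's actual algorithm, and every function name, threshold and demo value in it is hypothetical.

```python
# Hypothetical sketch of estimating head micro-movement parameters
# from video via inter-frame differences. Not ELSYS's algorithm.
import numpy as np

def vibration_parameters(frames: np.ndarray, fps: float):
    """frames: (T, H, W) grayscale video of a cropped head region."""
    # Signed inter-frame pixel differences: a crude proxy for movement.
    diffs = np.diff(frames.astype(np.float32), axis=0)  # (T-1, H, W)

    # "Amplitude": mean absolute per-frame change over the region.
    amplitude = float(np.abs(diffs).mean())

    # "Frequency": dominant component of the mean change signal.
    signal = diffs.mean(axis=(1, 2))                    # (T-1,)
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    dominant_freq = float(freqs[spectrum.argmax()])

    return amplitude, dominant_freq

# Synthetic demo: 10 seconds of a 64x64 "head" whose brightness
# oscillates at 4 Hz on top of sensor noise.
fps = 30.0
t = np.arange(int(10 * fps))
frames = np.random.rand(t.size, 64, 64) * 5.0
frames += 50.0 * np.sin(2 * np.pi * 4.0 * t / fps)[:, None, None]
print(vibration_parameters(frames, fps))  # dominant_freq ~ 4.0 Hz
```

Even if parameters like these can be extracted reliably from video, the leap from them to emotional states is precisely the claim at issue.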

But the many claims made about its effects seem unprovable.

The research cited in support of VibraImage often relies on experiments that already assume the system is effective.

Exactly how certain head movements are linked to specific emotional-mental states is not explained.

It may use AI processing in behaviour detection or emotion recognition when there is a technical necessity for it.

Minkin has also published a technical response to my paper.

But there is no compelling evidence that this is the case.

Systems like VibraImage are opaque and unproven, and are developed and implemented without democratic input or oversight.

They are also largely unregulated, with the potential to cause serious harm.

VibraImage is not the only such system out there.

Other AI systems designed to detect suspicious or deceptive individuals have been trialled.

For example, Avatar has been tested on the US-Mexico border, and iBorderCtrl at the EU's borders.

Both are designed to detect deception among migrants.

The broader algorithmic emotion recognition industry was worth up to US$12 billion in 2018, and is expected to reach US$37.1 billion by 2026.

This is an important start.

Other countries should now follow this lead to ensure that possible harms from these high-risk systems are minimised.
