Can AI technology interpret human emotions?

June 16, 2021 | By Horst Buchwald

Berlin, 16.6.2021

There is no doubt about it: more and more people are rejecting AI-based facial recognition. The reason: the stated purpose is often a pretext, and the actual goal of the application is something else entirely. One hotly debated topic is human emotions. Even from person to person, it is often difficult or outright impossible to interpret the emotions of our fellow human beings correctly. So why should software do a better job? Anyone who looks behind the facade has good reason to wonder.

Here are a few examples to get you thinking:

Three Italian art museums are now using AI-powered camera systems to "read" their visitors' facial expressions and determine which artworks are most popular. Critics point to privacy concerns and to unverified, dubious claims that AI technology is capable of interpreting human emotions.

The facial recognition technology, visible as cameras mounted on walls along with a disclaimer, is being used in museums in Rome, Bologna and Parma.

The technology is reportedly able to read five facial expressions – happy, sad, neutral, surprised and angry – as well as a person's gender, age and eye movements. According to Riccardo Scipinotti, co-developer of the technology, most of the facial expressions interpreted so far are neutral. Scipinotti says the system is privacy-friendly because it does not store any of the images; instead, it only outputs numbers for further analysis.

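How such a pipeline could avoid storing images while still producing usable statistics is easy to illustrate. The sketch below is purely hypothetical and is not the museums' actual software: `classify_expression` is a placeholder for whatever model is deployed, and only a running tally of the five labels named above is kept.

```python
# Illustrative sketch only: the museum system's real code is not public.
# classify_expression() is a stand-in for an actual face/emotion model;
# the five labels match those named in the article.
from collections import Counter
from typing import Iterable

LABELS = ("happy", "sad", "neutral", "surprised", "angry")

def classify_expression(frame) -> str:
    """Hypothetical placeholder for a face/emotion classifier.
    A real deployment would run a trained model here."""
    raise NotImplementedError("replace with an actual model")

def summarize_visitors(frames: Iterable) -> Counter:
    """Aggregate per-frame predictions into counts per emotion label.

    Only the running tally is retained; each frame is discarded after
    classification, which is the privacy argument the developer makes.
    """
    counts = Counter({label: 0 for label in LABELS})
    for frame in frames:
        label = classify_expression(frame)
        if label in counts:
            counts[label] += 1
        # the frame goes out of scope here; no image is written to disk
    return counts
```
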
Earlier this year, a new website, Emojify, launched to draw attention to the flaws of AI emotion recognition, which uses algorithms to scan a person's face and "find" emotions. Critics argue that the approach is inaccurate, biased and limited: facial expressions are far more nuanced, they say, and a grimace or a smile can mean different things to different people and cultures.

Still, it's already used in China to monitor behavior in schools and interrogate suspects. Some companies also use it to assess job applicants and read customers' faces to identify their likes and dislikes.