A new study tested eight out-of-the-box AI algorithms to compare how humans and machines recognize emotions. According to its results, the best algorithms can match or even outperform humans at recognizing certain expressions, such as happiness and sadness.
In the study, both people and algorithms had to recognize six emotions in facial expressions: happiness, sadness, anger, surprise, fear, and disgust. The test was based on 938 videos of people displaying different facial expressions, some posed and others spontaneous.
Human participants achieved an average accuracy of 73%, while the accuracy of the eight participating algorithms varied considerably, ranging from 49% to 62%.
At the same time, the best-performing algorithms recognized happy and sad facial expressions as well as, or even better than, humans did. Interestingly, both people and artificial systems were consistently less accurate with spontaneous affective behavior than with posed expressions.
“In fact, in real life there is not and cannot be any reference for a facial expression corresponding to this or that emotion. Some will find your smile joyful, while others will take it for politeness. Therefore, even people with a similar cultural background may disagree on what kind of emotion they see in a video. Reaching 100% accuracy is impossible even for humans. Unlike humans, though, such systems can process massive amounts of data, and that is their great advantage,” said George Pliev, Founder and Managing Partner of Neurodata Lab, a company that specializes in emotion AI solutions.