Artificial intelligence (AI) has the potential to bring about a technological renaissance. Although it can’t properly mimic human behavior or thought just yet, it is trouncing us in one particular area: pattern recognition.
We’re pretty good at recognizing patterns, sure – after all, that’s essentially what basic scientific thought is. AIs, however, are now able to detect breast cancer and pick IVF-suitable embryos more accurately than medical professionals, and a new study suggests that this could apply to recognizing emotions too.
A team at The Ohio State University has taken a method humans seem to use automatically to read emotions and applied it to an AI. Ultimately, the AI proved better at detecting emotional states this way than humans – although, rather surprisingly, the AI isn’t even the key finding of the research.
Cognitively processing emotional content is something else entirely, but when it comes to detecting how someone is feeling based on their facial expression, there are multiple visual cues.
One is the hue of someone’s face, which is partly controlled by localized blood flow – something technically known as “vascular response”. As the authors note in their study, these facial blood flow changes match up to the type of expression and its “valence” – its nebulously defined inherent “goodness” or “badness”.
Writing in the Proceedings of the National Academy of Sciences, the team’s hypothesis went one step further. Can a person, using blood flow color changes alone, detect the type of emotion and its valence on another person’s face if their facial expression doesn’t change?
To test this, they took hundreds of images of 18 facial expressions from 184 people of different genders, ethnicities, and overall skin tones. Digital analysis quickly revealed that emotions – from the simple “happy”, “sad”, and “disgusted” to the more nuanced “happily surprised” and “angrily surprised” – fit into color patterns influenced by facial blood flow.
They’re not simple or uniform across the face for each emotion, mind you. When someone feels disgust, for example, the hue around the lips is different from that around the nose and forehead.
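The paper doesn’t publish its analysis code, but the idea of summarizing facial color region by region can be sketched in a few lines. Everything below – the region boxes, the synthetic “face”, and the choice of mean hue as the summary statistic – is invented for illustration:

```python
import numpy as np
import colorsys

# Hypothetical sketch: summarize facial color per region, since the study
# found hue patterns differ across the face for each emotion.
# These row/col boxes are made up for illustration, not facial landmarks.
REGIONS = {
    "forehead": (slice(0, 40), slice(20, 80)),
    "nose":     (slice(40, 70), slice(40, 60)),
    "lips":     (slice(70, 90), slice(35, 65)),
}

def region_hues(rgb_image):
    """Return the mean hue (0-1) of each facial region."""
    feats = {}
    for name, (rows, cols) in REGIONS.items():
        patch = rgb_image[rows, cols].reshape(-1, 3) / 255.0
        r, g, b = patch.mean(axis=0)
        feats[name] = colorsys.rgb_to_hsv(r, g, b)[0]
    return feats

# Synthetic 100x100 "face": gray everywhere, warmer around the lips
img = np.full((100, 100, 3), 180, dtype=np.uint8)
img[70:90, 35:65] = (200, 150, 120)   # orange-ish lip region
print(region_hues(img))
```

A real pipeline would segment the face with landmarks rather than fixed boxes, but the per-region color summary is the core idea.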
In any case, using this complex emotional palette, the team then superimposed various hues corresponding to a range of emotional states onto neutral expressions, and asked a handful of participants to guess how the person was feeling.
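The superimposition step amounts to shifting a neutral face’s colors toward an emotion’s characteristic tint. A minimal sketch, in which the tint values, blend strength, and flat stand-in “face” are all made up:

```python
import numpy as np

# Hypothetical version of the study's manipulation: tint a neutral face
# with an emotion's characteristic color shift. The tint is invented.
HAPPY_TINT = np.array([15, 5, -10])   # nudge toward red/yellow (made up)

def apply_hue(face_rgb, tint, strength=1.0):
    """Add a small per-channel color shift to an image, clipped to 0-255."""
    shifted = face_rgb.astype(int) + (strength * tint).astype(int)
    return np.clip(shifted, 0, 255).astype(np.uint8)

neutral = np.full((100, 100, 3), 170, dtype=np.uint8)  # flat stand-in face
tinted = apply_hue(neutral, HAPPY_TINT)
print(tinted[0, 0])   # [185 175 160]
```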
Remarkably, most of the time, the participants guessed correctly, including for happy hues (70 percent), sad hues (75 percent), and angry hues (65 percent). This strongly suggests that facial hue is an indicator of emotion that we register quickly and involuntarily.
They upped the ante by applying mismatched hues to other expressions, too – for example, adding a “sad” hue to a happy expression. Although the task was more difficult, participants still picked the correct emotion most of the time.
“The emotion information transmitted by color is at least partially independent from that by facial movement,” the study concludes. Based on the fact that we have little facial hair compared to our far floofier primate cousins, the study authors suggest that “recent evolutionary forces” have allowed us to transmit emotions in this unique way.
Using this new database, the team built a basic AI that understood this emotional palette, and it proved at least as good as, and sometimes better than, humans. According to a press release, it recognized happiness 90 percent of the time based on hues alone, with anger (80 percent), sadness (75 percent), and fear (70 percent) also frequently detected.
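The study doesn’t spell out the classifier’s internals, but hue-based emotion recognition reduces to mapping a color-feature vector to the nearest known emotion signature. A toy nearest-centroid version on synthetic data – the three-region “hue signatures” and noise level below are entirely invented:

```python
import numpy as np

# Toy illustration of hue-based emotion classification. The study's actual
# model and features aren't reproduced; these per-region hue signatures
# and the generated samples are synthetic.
rng = np.random.default_rng(0)

SIGNATURES = {"happy": [0.05, 0.10, 0.08],
              "sad":   [0.60, 0.55, 0.58],
              "angry": [0.95, 0.02, 0.90]}

def make_samples(n=50, noise=0.03):
    """Generate noisy hue vectors around each emotion's signature."""
    X, y = [], []
    for label, sig in SIGNATURES.items():
        X.append(rng.normal(sig, noise, size=(n, 3)))
        y += [label] * n
    return np.vstack(X), np.array(y)

X, y = make_samples()

# Nearest-centroid classifier: predict the emotion whose mean hue
# signature is closest to the input vector.
centroids = {lab: X[y == lab].mean(axis=0) for lab in SIGNATURES}

def predict(x):
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

acc = np.mean([predict(x) == lab for x, lab in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```

Real hue values are circular (0.95 and 0.02 are neighboring reds), so a production model would need a circular distance; plain Euclidean distance is enough for this sketch.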
Once again, all this is pattern recognition, but it’s hard not to be impressed by how easily humans are defeated in this regard. In fact, this AI is so effective that the researchers have already patented and commercialized it.