It may be true that a smile can be worth a thousand words. But understanding what lies behind it is not as easy as it seems. On the contrary: predicting the emotions of others based solely on their facial expressions is almost impossible. So says Aleix Martinez, a professor of computer engineering at Ohio State University who studies how effectively facial recognition systems can interpret human behavior and emotions. His latest research, presented at the recent annual meeting of the American Association for the Advancement of Science, amounts to a clear rebuttal: the expression on someone's face is a poor clue for interpreting their emotions.
"Our facial expressions vary greatly depending on context and on our cultural background," explains Martinez. "And it's important to realize that not everyone who smiles is happy, and not all happy people smile. I would go even further: many people are not necessarily unhappy when they are not smiling. And if you're happy all day long, you don't walk around with a perpetual smile on your face: you simply are happy."
So a simple smile doesn't tell us much. In some countries, smiling is a common social norm; in others it might make you look like a maniac. Not surprisingly, when Martinez and his team tried to analyze the movements of human facial muscles and relate them to the emotions of a group of volunteers, the results were almost always incorrect. We humans do try to read the mood and behavior of our fellow human beings through their expressions, it's true, but we base our judgments (always fallible, let's remember) on a broader set of clues: facial color, posture, movements of the arms and legs, contextual information. For a computer, things are different: more and more often we hear about artificial intelligence systems that use facial recognition to predict human intentions. For example in the field of security, flagging people who are about to commit a crime, or even at school, where there is talk of automatically spotting students who are not paying attention to the lesson.
A business into which a growing number of companies are venturing, with results that Martinez calls disappointing. He and his team have in fact tested many of the technologies of this type available today, finding them mostly ineffective. According to the expert, the problem is inherent in the nature of these algorithms: facial movements alone are not enough to read human behavior.
In one of his experiments, for example, he showed a group of volunteers an extreme close-up of a human face: the mouth open in a mute scream, the muscles taut, the complexion reddened. "When people looked at the photo they thought: this person must be really annoyed, or particularly angry, to the point of screaming," explains Martinez. "But when they saw the whole picture, they found themselves looking at a footballer screaming to celebrate a goal".
In short, judging others based solely on their facial expressions is not a good idea. Neither for a human being, nor for a machine. Martinez, however, is optimistic: sooner or later artificial intelligence will become powerful enough to take other kinds of clues (social, postural, contextual) into account and thereby interpret the intentions of human beings. But always keeping in mind two important facts about technology in general. "The first," concludes the expert, "is that no technology will ever be 100% accurate. And the second is that deciphering a person's intentions takes much more than a glance at the expression on their face: this is something that both the people who work in this field and the algorithms they produce will have to understand".