New Dutch emotion research

Peter Lewinski, Marieke Fransen and Ed Tan recently made headlines with the results of their research into automated facial coding of emotions. ACCESS asked Peter Lewinski to tell us more about his project, and he kindly sent us this description of the research.

We welcome news about Dutch emotion research on our website – please contact k[dot]steenbergh[at]vu[dot]nl if you would like to share your project with our community.

MEASURING EMOTIONAL ATTITUDES WITH AUTOMATED FACIAL CODING

Nonverbal communication of emotions

Suppose you want to sell your home-made jewelry on King's Day in Amsterdam. How can you tell in advance whether people like your necklaces and earrings? Since the earliest scientific inquiries into preferences, the only available method has been to ask people how they feel about them. However, psychologists have found that when people are asked for their opinions, they tend to become self-aware and start giving socially desirable answers. In other words, self-reports emanate from Daniel Kahneman's System 2, the slow, logical and conscious system. Questionnaires and interviews capture constructions and interpretations shaped by self-reflective and socially reflective human judgment.

The contents of the fast, emotional and subconscious System 1 have long remained elusive, even though its messages are ubiquitous in everyday nonverbal behavior such as gestures, postures, facial expressions and tone of voice. Decoding those messages has been hampered, if not made impossible, by the subjective and laborious nature of the analyses. In the past decade, information technology and artificial intelligence have come to the rescue of the direct measurement of emotions. Hopes are high that we can leave much of the workload of coding nonverbal behavior to the computer and so steer clear of our own interpretation biases. The research community working on deciphering facial movements believes that facial expressions convey interculturally shared core-affect signatures; unlike System 2 responses, notably verbal ones, they need no cross-cultural translation. Therefore, in our study, recently published in the Journal of Neuroscience, Psychology, and Economics, we investigated the predictive value of facial expressions of emotions in response to amusing (i.e. simply funny) video stimuli.

To smile is to “be-with” the stimuli

Emotions, and especially facial expressions of emotions, are often conceptualized as an access point to people's inner states. A disgusted face tells people that we do not like the object or person we happen to be watching. This is Paul Ekman's basic-emotion conceptualization of facial expressions. In our own work we extend Ekman's view with the concept of emotional action readiness, first proposed by Nico Frijda. According to Frijda, facial expressions exhibit inclinations to act. For instance, a happy face in response to an object, person or situation expresses a readiness to approach, that is, to "be with" the object or person, or to engage in the situation. Starting from this premise, we recently investigated whether the facial expressions of people watching a video ad predicted how much they liked it. We were able to demonstrate that they did.

Coding the smile

In order to capture global patterns, we recorded the facial behavior of a large sample of people in response to our video. We then used automated facial coding software, FaceReader, to analyze that behavior, saving us months of manual coding. FaceReader is a system that has built up a huge knowledge base of expressions through machine learning, that is, by extracting underlying dimensions from thousands of example expressions. We tallied FaceReader's output for the expressions exhibited by our test viewers and related it to how much these people liked the video. What we found is that the more people expressed happiness, the more they liked the stimulus. Moreover, the more people smiled, the more they liked the agent behind the video, in this case the advertised brand. Hence, following Nico Frijda's conceptualization, we argue that the more a video evokes expressions of positive emotions, the happier people feel, which means they want to be with the stimulus and thus like it more.
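The published study reports a full statistical analysis; as a rough illustration of the basic step, here is a minimal Python sketch, not the authors' actual pipeline. It assumes a hypothetical CSV export with one row per viewer, containing an averaged FaceReader happiness score and a self-reported liking rating; the file and column names are invented for the example.

```python
# Minimal illustration (not the authors' actual pipeline): relate averaged
# FaceReader "happiness" output to self-reported liking of the video.
# The CSV file and its column names are assumptions made for this sketch.
import pandas as pd
from scipy import stats

# One row per viewer: mean happiness intensity (0-1) over the video,
# plus a liking rating collected afterwards.
data = pd.read_csv("facereader_summary.csv")

# Correlate expressed happiness with reported liking.
r, p = stats.pearsonr(data["mean_happiness"], data["liking"])
print(f"Pearson r = {r:.2f} (p = {p:.3f})")

# A simple regression gives the "predictive value" reading:
# how much liking changes per unit of expressed happiness.
fit = stats.linregress(data["mean_happiness"], data["liking"])
print(f"liking is roughly {fit.intercept:.2f} + {fit.slope:.2f} * mean_happiness")
```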

What is next?

Our study is just one of many recent applications of automated facial coding of emotions. Google Scholar will return a mass of additional examples in educational, social or consumer psychology. We believe that scholars in the humanities should be encouraged to explore the merits of the method presented above. Computers can provide them with pieces of the puzzles they are trying to solve. Did Dutch 17th-century masters portray people as experiencing more negative or more positive emotions than their Renaissance predecessors did? On what mixture of emotions is Mona Lisa's mysterious expression based? Example answers may be found in the pictures below. Is Rembrandt a bit angry or sad, and Mona Lisa almost indifferent, though not completely so? Of course, the "tentative final answer" is in the hands of the interpreting humanist, but they can at least delegate some of the manual labor to a well-trained machine.

[Image: Rembrandt self-portrait analyzed with automated facial coding]

[Image: Mona Lisa analyzed with automated facial coding]
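For readers who would like to experiment along these lines themselves: FaceReader is proprietary, but open-source alternatives exist. The sketch below uses the `fer` Python package as a stand-in and an assumed image file name; it illustrates the idea and is not the tool used for the pictures above.

```python
# Hypothetical sketch: estimate basic-emotion scores for a face in a digitized
# painting. Uses the open-source "fer" package as a stand-in for FaceReader;
# the image file name is an assumption.
import cv2
from fer import FER

image = cv2.imread("mona_lisa.jpg")   # any digitized portrait
detector = FER(mtcnn=True)            # MTCNN detection copes better with non-photographic faces

results = detector.detect_emotions(image)
if results:
    emotions = results[0]["emotions"]  # e.g. {'happy': 0.31, 'neutral': 0.52, ...}
    for label, score in sorted(emotions.items(), key=lambda kv: -kv[1]):
        print(f"{label:>8}: {score:.2f}")
else:
    print("No face detected; paintings can defeat detectors trained on photographs.")
```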


If you are interested in reading further, we refer you to our article in the Journal of Neuroscience, Psychology, and Economics mentioned above.

Peter Lewinski

 

Peter Lewinski is a Marie Curie Research Fellow at the University of Amsterdam and at VicarVision, an artificial intelligence firm. His work focuses on facial coding, advertising and consumer behavior. See www.uva.nl/profiel/p.lewinski.
