eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Mimicry in the Recognition of Emotional Facial Expressions

Abstract

Facial expressions signal emotions and influence social interactions. One mechanism hypothesized to support the recognition of facial expressions is sensorimotor simulation—the observer internally simulates the observed expression, and this affords a first-person, experiential understanding of how the target feels. When sensorimotor simulation is strong enough, this internal activity is expressed externally as facial mimicry. Numerous studies have found that interfering with mimicry interferes with emotion recognition, particularly when decoding subtle expressions. This implies that mimicry reflects a form of computation that facilitates recognition when needed.

The Embodied Computation model of mimicry hypothesizes that simulation aids recognition by compensating for visual mechanisms when visual emotion information is sparse: the more challenging an expression is to decode, the more simulation is involved in decoding it. It makes the counterintuitive prediction that more mimicry will occur when emotion evidence is less available, but only if decoding the emotion is necessary. The Motor-Matching model of mimicry hypothesizes that mimicry is based on an automatic action-perception link (Chartrand & Bargh, 1999; Hess & Fischer, 2013). It predicts that mimicry will reflect the emotion evidence: the more evidence, the more mimicry. If recognition is required, this will only increase attention and amplify the overall mimicry. The Emotional Mimicry in Context model hypothesizes that mimicry depends not on the amount of emotion evidence that is seen but on whether the signal is interpreted as promoting affiliation (Hess & Fischer, 2014). It predicts that the affiliative meaning of the observed expression determines whether or not mimicry occurs.

These hypotheses were tested in three experiments measuring facial electromyography (EMG) elicited by emotional faces under various challenging conditions. The results are argued to support a novel proposal that combines the insights of the Embodied Computation and Emotional Mimicry in Context models.
