Diagnostic Features of Emotional Expressions Are Processed Preferentially
Open Access
- 25 July 2012
- Research article
- Published by Public Library of Science (PLoS) in PLOS ONE
- Vol. 7 (7), e41792
- https://doi.org/10.1371/journal.pone.0041792
Abstract
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant to the task at hand or when faces appear at different locations in the visual field. To this end, fearful, happy and neutral faces were presented to healthy individuals in two experiments while eye movements were measured. In Experiment 1, participants completed an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive eye movements from a more elaborate scanning of faces, stimuli were presented for either 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements merely reflect a general bias toward certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region of the face. Furthermore, the eye region was attended to more strongly when fearful or neutral faces were shown, whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow further information to be extracted from the stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orienting of attention toward them.
This mechanism might crucially depend on amygdala functioning and is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.
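The region-based gaze analysis summarized above amounts to aggregating fixation durations within areas of interest (AOIs) such as the eyes and mouth. A minimal sketch of that aggregation step is shown below; the AOI rectangles, fixation records and coordinate system are illustrative assumptions, not the study's actual data or analysis code.

```python
# Sketch: sum fixation durations per facial area of interest (AOI).
# All coordinates and fixation records below are made up for illustration.

def dwell_times(fixations, aois):
    """Sum fixation durations (ms) falling inside each AOI rectangle.

    fixations: iterable of (x, y, duration_ms) tuples.
    aois: mapping of name -> (x0, y0, x1, y1) bounding box.
    """
    totals = {name: 0 for name in aois}
    for x, y, duration in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += duration
                break  # assign each fixation to at most one AOI
    return totals

# Hypothetical AOIs on a 512 x 512 face image.
aois = {
    "eyes":  (100, 120, 412, 220),
    "nose":  (180, 220, 332, 320),
    "mouth": (160, 320, 352, 420),
}

# Hypothetical fixations: (x, y, duration_ms); the last one lands
# outside every AOI and is therefore ignored.
fixations = [(250, 170, 300), (260, 180, 250), (255, 370, 180), (50, 50, 120)]

print(dwell_times(fixations, aois))
# → {'eyes': 550, 'nose': 0, 'mouth': 180}
```

Comparing such per-AOI dwell times across emotion conditions (fearful, happy, neutral) is the kind of analysis that would reveal the eye-region versus mouth-region preferences the abstract reports.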