Tuning the developing brain to social signals of emotions

Abstract
Most humans develop a capacity to recognize the emotional signals of different facial expressions. This capacity is mediated by a brain network that involves emotion-related brain circuits (including the amygdala and the orbitofrontal cortex) and higher-level visual representation areas (the fusiform gyrus and the superior temporal sulcus). Human infants start to discriminate facial expressions by the second half of their first year. At approximately the same age, they begin to exhibit adult-like attentional biases towards certain salient facial expressions (such as expressions of fear), as well as enhanced vision- and attention-related event-related brain potentials in response to fearful facial expressions. These findings, together with experimental lesion and anatomical tracing studies in other species, suggest that the key components of the emotion-processing network and their interconnections are established and become functional early in postnatal life. Developmental studies in humans and monkeys have further shown that face-processing mechanisms are initially broadly tuned (activated by a broad range of stimuli, such as human and monkey faces) but narrow with experience, becoming more specialized for specific types of perceptual discrimination (such as the discrimination of information in human faces). Collectively, these data suggest that the acquisition of representations of facial expressions may reflect the functional emergence of an experience-expectant mechanism by the second half of the first year and the rapid experience-driven attunement of this mechanism to species-typical facial expressions. Specifically, components of the emotion-processing network (such as the amygdala) may be prepared, to a limited extent, for processing and storing information about biologically salient cues, but they require exposure to facial expressions at a specific developmental time point ('expected experience') in order to be refined and to develop towards more mature forms. Although the basic organization of the emotion-recognition networks is specified by an experience-expectant neural circuitry that emerges during a sensitive period, representations of facial expressions are likely to be continually fine-tuned by individual-specific experiences (reflecting the experience-dependent development of facial-expression processing). Genetically driven differences in the reactivity of emotion-related brain circuits (for example, polymorphisms that affect serotonin transmission and amygdala reactivity) might, in combination with environmental factors (such as exposure to negative emotions), bias the developmental process towards heightened sensitivity to signals of certain negative emotions.