A computational account of the mechanisms underlying face perception biases in depression.

Abstract
Here, we take a computational approach to understand the mechanisms underlying face perception biases in depression. Thirty participants diagnosed with major depressive disorder and thirty healthy controls took part in three studies involving recognition of identity and emotion in faces. We used signal detection theory to determine whether depression is associated with genuinely perceptual biases, over and above decisional biases. Compared with controls, participants with depression showed lower sensitivity to happiness overall, and lower sensitivity to both happiness and sadness when stimuli were ambiguous. Our use of highly controlled face stimuli ensures that this asymmetry is truly perceptual in nature, rather than an artifact of studying expressions with inherently different discriminability. We found no systematic effect of depression on the perceptual interactions between face expression and identity. Decisional strategies also differed between participants with depression and controls, but in a way that was highly specific to the stimulus set presented. Finally, we show through simulation that the observed perceptual effects, as well as other biases reported in the literature, can be explained by a computational model in which channels encoding positive expressions are selectively suppressed.
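As a minimal illustration of the logic summarized above (not the authors' model or analysis code), the sketch below simulates a detection task in which evidence for a happy expression passes through a gain-scaled channel. Reducing the channel's gain, as hypothesized for positive-expression channels in depression, lowers signal detection sensitivity (d') without requiring any change in the decision criterion. All gains, trial counts, and noise levels are hypothetical.

```python
# Toy sketch only: channel suppression lowers d' for happiness.
# Parameter values are hypothetical and chosen for illustration.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate_dprime(channel_gain, n_trials=5000, signal_strength=1.0, noise_sd=1.0):
    """d' for detecting an expression whose evidence is scaled by channel_gain."""
    signal = rng.normal(channel_gain * signal_strength, noise_sd, n_trials)  # expression present
    noise = rng.normal(0.0, noise_sd, n_trials)                              # neutral face
    criterion = 0.5 * channel_gain * signal_strength                         # unbiased observer
    hit_rate = np.clip((signal > criterion).mean(), 1e-3, 1 - 1e-3)
    fa_rate = np.clip((noise > criterion).mean(), 1e-3, 1 - 1e-3)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)                            # d' = z(H) - z(F)

# Hypothetical gains: intact channel vs. suppressed positive-expression channel.
print("control d' (happy):   ", round(simulate_dprime(channel_gain=1.0), 2))
print("suppressed d' (happy):", round(simulate_dprime(channel_gain=0.6), 2))
```

The same decomposition (sensitivity d' versus criterion c = -0.5[z(H) + z(F)]) underlies the group comparisons described in the abstract, which is what allows perceptual effects to be distinguished from decisional strategies.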
Funding Information
  • National Institute of Mental Health (R21MH112013)