A model of how depth facilitates scene-relative object motion perception
Open Access
- 14 November 2019
- journal article
- research article
- Published by Public Library of Science (PLoS) in PLoS Computational Biology
- Vol. 15 (11), e1007397
- https://doi.org/10.1371/journal.pcbi.1007397
Abstract
Many everyday interactions with moving objects benefit from an accurate perception of their movement. Self-motion, however, complicates object motion perception because it generates a global pattern of motion on the observer’s retina and radically influences an object’s retinal motion. There is strong evidence that the brain compensates by suppressing the retinal motion due to self-motion; however, this requires estimates of depth relative to the object, without which the appropriate self-motion component to remove cannot be determined. The underlying neural mechanisms are unknown, but neurons in brain areas MT and MST may contribute given their sensitivity to motion parallax and depth through joint direction, speed, and disparity tuning. We developed a neural model to investigate whether cells in areas MT and MST with well-established neurophysiological properties can account for human object motion judgments during self-motion. We tested the model by comparing simulated object motion signals to human object motion judgments in environments with monocular, binocular, and ambiguous depth. Our simulations show how precise depth information, such as that from binocular disparity, may improve estimates of the retinal motion pattern due to self-motion through increased selectivity among units that respond to the global self-motion pattern. The enhanced self-motion estimates emerged from recurrent feedback connections in MST and allowed the model to better suppress the appropriate direction, speed, and disparity signals from the object’s retinal motion, improving the accuracy of the object’s movement direction represented by motion signals. Research has shown that the accuracy with which humans perceive object motion during self-motion improves in the presence of stereo cues. Using a neural modelling approach, we explore whether this finding can be explained through improved estimation of the retinal motion induced by self-motion.
Our results show that depth cues that provide information about scene structure may have a large effect on the specificity with which the neural mechanisms for motion perception represent the visual self-motion signal. This in turn enables effective removal of the retinal motion due to self-motion when the goal is to perceive object motion relative to the stationary world. These results reveal a hitherto unknown critical function of stereo tuning in the MT-MST complex, and shed important light on how the brain may recruit signals from upstream and downstream brain areas to simultaneously perceive self-motion and object motion.
Funding Information
- Office of Naval Research (N00014-14-1-0359)
- Office of Naval Research (N00014-18-1-2283)
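The computation the abstract describes, removing the self-motion component of retinal flow at the object's depth so that only scene-relative object motion remains, can be illustrated with a minimal sketch. This is not the paper's neural model; it is a toy geometric version using the standard pinhole motion-field equation for a translating observer, with rotation omitted and all function names invented for illustration:

```python
import numpy as np

def self_motion_flow(point, depth, trans, focal=1.0):
    """Predicted retinal flow at image point (x, y) caused purely by
    observer translation trans = (Tx, Ty, Tz), for a pinhole camera.
    Note the 1/depth scaling: the same self-motion produces different
    retinal flow at different depths, which is why depth estimates are
    needed to remove the correct component."""
    x, y = point
    Tx, Ty, Tz = trans
    u = (-focal * Tx + x * Tz) / depth
    v = (-focal * Ty + y * Tz) / depth
    return np.array([u, v])

def scene_relative_motion(retinal_motion, point, depth, trans):
    """Flow parsing (toy version): subtract the self-motion flow
    predicted at the object's image location and estimated depth."""
    return retinal_motion - self_motion_flow(point, depth, trans)

# A world-stationary object seen during forward translation:
trans = (0.0, 0.0, 1.0)          # observer moving straight ahead
point = (0.2, 0.0)               # object's image location
retinal = self_motion_flow(point, depth=2.0, trans=trans)

# With the correct depth the residual motion is zero (object is stationary).
print(scene_relative_motion(retinal, point, depth=2.0, trans=trans))

# With a wrong depth estimate, a spurious object motion remains,
# mirroring the paper's point that imprecise depth degrades the
# suppression of self-motion-induced retinal motion.
print(scene_relative_motion(retinal, point, depth=4.0, trans=trans))
```

With the true depth (2.0) the residual is (0, 0); with an overestimated depth (4.0) too little flow is subtracted and a spurious rightward residual of (0.05, 0) remains.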