Audiovisual time perception is spatially specific
Open Access
- Published: 25 February 2012
- Research article
- Published by Springer Science and Business Media LLC in Experimental Brain Research
- Vol. 218 (3), pp. 477–485
- https://doi.org/10.1007/s00221-012-3038-3
Abstract
Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.