Efficiency of Sensory Substitution Devices Alone and in Combination With Self-Motion for Spatial Navigation in Sighted and Visually Impaired
Open Access
- 10 July 2020
- journal article
- research article
- Published by Frontiers Media SA in Frontiers in Psychology
- Vol. 11, 1443
- https://doi.org/10.3389/fpsyg.2020.01443
Abstract
Human adults can optimally combine vision with self-motion to facilitate navigation. In the absence of visual input (e.g., in dark environments or with visual impairment), sensory substitution devices (SSDs) such as The vOICe or BrainPort, which translate visual information into auditory or tactile information, could be used to increase navigation precision when integrated with each other or with self-motion. In Experiment 1, we compared The vOICe and BrainPort, alone and together, in an aerial-map task performed by a group of sighted participants. In Experiment 2, we examined whether sighted individuals and a group of visually impaired (VI) individuals could benefit from using The vOICe, with and without self-motion, to accurately navigate a three-dimensional (3D) environment. In both studies, 3D motion-tracking data were used to determine the precision with which participants performed two tasks (an egocentric task and an allocentric task) under three conditions (two unisensory conditions and one multisensory condition). In Experiment 1, we found no benefit of using the devices together. In Experiment 2, sighted performance with The vOICe was almost as good as with self-motion despite a short training period, although we found no benefit (reduction in variability) of using The vOICe and self-motion in combination compared with either in isolation. In contrast, the VI group did benefit from combining The vOICe and self-motion despite the low number of trials. Finally, while both groups became more accurate in their use of The vOICe with increased trials, only the VI group showed increased accuracy in the combined condition. Our findings highlight how exploiting non-visual multisensory integration to develop new assistive technologies could be key to helping blind and VI persons, especially given their difficulty in attaining allocentric information.
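The "optimal combination" benchmark referred to above is the standard maximum-likelihood cue-integration model: two independent, unbiased estimates are averaged with reliability-based weights, so the combined estimate has lower variance than either cue alone. A minimal sketch of that prediction (function names are illustrative, not from the study):

```python
import math

def mle_combined_sd(sd_a: float, sd_b: float) -> float:
    """Predicted standard deviation when two independent, unbiased cues
    (e.g., an SSD estimate and a self-motion estimate) are integrated
    optimally: 1/sigma_c^2 = 1/sigma_a^2 + 1/sigma_b^2."""
    var_c = (sd_a**2 * sd_b**2) / (sd_a**2 + sd_b**2)
    return math.sqrt(var_c)

def cue_weights(sd_a: float, sd_b: float) -> tuple:
    """Reliability-based weights: the lower-variance cue dominates
    the combined estimate."""
    inv_a, inv_b = 1 / sd_a**2, 1 / sd_b**2
    total = inv_a + inv_b
    return inv_a / total, inv_b / total

# Two equally reliable cues: combined SD drops by a factor of sqrt(2)
print(mle_combined_sd(2.0, 2.0))  # ≈ 1.414
# A cue twice as precise gets four times the weight
print(cue_weights(1.0, 2.0))      # (0.8, 0.2)
```

A "benefit of combination" in this framework means the measured variability in the multisensory condition approaches this prediction, which is the comparison the abstract describes for the sighted and VI groups.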