Indoor Pedestrian Localization With a Smartphone: A Comparison of Inertial and Vision-Based Methods

Abstract
Indoor pedestrian navigation systems are increasingly needed in a wide range of applications. However, such systems still face many challenges. In addition to being accurate, a pedestrian positioning system must be mobile, cheap, and lightweight. Many approaches have been explored. In this paper, we take advantage of the sensors integrated in a smartphone and their capabilities to develop and compare two low-cost, hands-free, and handheld indoor navigation systems. The first relies on embedded vision (the smartphone camera), while the second is based on low-cost smartphone inertial sensors (magnetometer, accelerometer, and gyroscope) to provide a relative position of the pedestrian. Both algorithms are computationally lightweight, since their implementations take into account the restricted resources of the smartphone. In the experiments conducted, we evaluate and compare the accuracy and repeatability of the two positioning methods over different indoor paths. The results obtained demonstrate that the vision-based localization system outperforms the inertial sensor-based positioning system.
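As an illustration of the inertial approach, the sketch below shows a generic step-and-heading pedestrian dead-reckoning update of the kind such systems typically use. It is not the paper's algorithm: the step-detection threshold, the assumed stride length, and the names (detect_step, pdr_update, STEP_LENGTH_M) are hypothetical, and the heading is assumed to come from a separate gyroscope/magnetometer fusion.

```python
import math

# Hypothetical constants -- the paper does not report these values.
STEP_LENGTH_M = 0.7          # assumed average stride length (m)
ACC_PEAK_THRESHOLD = 1.5     # assumed accelerometer-norm peak threshold (m/s^2 above gravity)

def detect_step(acc_magnitude, prev_magnitude):
    """Very simple upward threshold-crossing step detector on the accelerometer norm."""
    return prev_magnitude < ACC_PEAK_THRESHOLD <= acc_magnitude

def pdr_update(position, heading_rad, step_detected, step_length=STEP_LENGTH_M):
    """Advance the 2-D position by one stride along the current heading."""
    if not step_detected:
        return position
    x, y = position
    return (x + step_length * math.cos(heading_rad),
            y + step_length * math.sin(heading_rad))

# Usage sketch: acceleration magnitude from the accelerometer,
# heading from gyroscope/magnetometer fusion (values here are made up).
pos = (0.0, 0.0)
prev_mag = 0.0
for acc_mag, heading in [(1.8, 0.0), (1.2, 0.0), (1.9, math.pi / 2)]:
    step = detect_step(acc_mag, prev_mag)
    pos = pdr_update(pos, heading, step)
    prev_mag = acc_mag
print(pos)  # relative position after two detected steps
```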
Funding Information
  • Région Centre through the AZIMUT Project
