Objects distance measurement in augmented reality for providing better user experience
Open Access
- 21 January 2021
- Research article
- Published by IOP Publishing in IOP Conference Series: Materials Science and Engineering
- Vol. 1032 (1), 012020
- https://doi.org/10.1088/1757-899x/1032/1/012020
Abstract
In this paper we present an algorithm for providing a better user experience in AR applications based on an HMD (Head Mounted Display), a pair of cameras, and deep-learning semantic segmentation. The user views the scene captured by the cameras through the HMD, along with augmented information about the distances to all detected objects. The cameras are attached to the HMD and must be located as close to each other as possible. Moreover, one of the cameras is shifted a known distance closer to the scene with respect to the other. Based on the traditional pinhole camera model and on estimating the size of a given object's projection, in pixel coordinates, in both cameras, we can calculate the distance from the cameras to the object. To locate the projected objects and measure their size, we use semantic segmentation based on a deep learning algorithm.
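Under the pinhole model described in the abstract, the same object projects larger in the camera that is shifted closer to the scene, and the known shift plus the two projected sizes determine the distance. A minimal sketch of that calculation (the function name and parameters are illustrative, not taken from the paper): with focal length f and true object height H, the projections are h_far = f·H/d and h_near = f·H/(d − s), which gives d = s·h_near/(h_near − h_far).

```python
def distance_from_projections(h_far: float, h_near: float, shift: float) -> float:
    """Estimate the distance d from the rear camera to an object.

    h_far  -- projected object size in pixels in the rear camera
    h_near -- projected size in pixels in the camera shifted `shift`
              units closer to the scene (so h_near > h_far)
    shift  -- known baseline shift between the two cameras, in scene units

    Pinhole model: h_far = f*H/d and h_near = f*H/(d - shift),
    hence d = shift * h_near / (h_near - h_far). The focal length f and
    the real object size H cancel out, so neither needs to be known.
    """
    if h_near <= h_far:
        raise ValueError("the near camera's projection must be larger")
    return shift * h_near / (h_near - h_far)
```

For example, an object 4 m from the rear camera with a 1 m shift yields projections in ratio 4:3 (e.g. 4 px vs 3 px), and the formula recovers d = 1 × 4 / (4 − 3) = 4 m. In practice the projected sizes would come from the bounding extents of the semantic-segmentation masks in each camera image.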