Tailored Reality: Perception-aware Scene Restructuring for Adaptive VR Navigation

Abstract
In virtual reality (VR), virtual scenes are pre-designed by their creators, whereas our physical surroundings vary widely in size, layout, and composition. To bridge this gap and enable natural navigation, recent solutions redirect users or recreate the virtual content, but they suffer from either an interrupted experience or distorted appearances. We present a novel VR-oriented algorithm that automatically restructures a given virtual scene to fit a user's physical environment. Unlike previous methods, ours introduces neither interruptions to the walking experience nor curved appearances. Instead, a perception-aware objective function optimizes our retargeting technique to preserve the fidelity of the virtual scene as it appears in VR head-mounted displays. Beyond geometric and topological properties, it emphasizes perceptual factors unique to the first-person view in VR, such as dynamic visibility and object-wise relationships. We conduct both analytical experiments and subjective studies. The results demonstrate our system's versatile capability and practicability for natural navigation in VR: it reduces the required virtual space by 40% without a statistically significant loss of perceptual identicality.
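The abstract frames retargeting as an optimization that balances geometric and topological preservation against first-person perceptual factors. A minimal sketch of what such a composite objective could look like is given below; the notation, individual energy terms, and weights are illustrative assumptions for exposition, not the paper's actual formulation:

  % Hypothetical composite objective for perception-aware retargeting of a
  % virtual scene S into a restructured scene S'. All terms and weights
  % below are assumed for illustration, not taken from the paper.
  \[
    E(S') \;=\; w_g\,E_{\text{geom}}(S')
          \;+\; w_t\,E_{\text{topo}}(S')
          \;+\; w_v\,E_{\text{vis}}(S')
          \;+\; w_r\,E_{\text{rel}}(S'),
  \]
  % where E_geom and E_topo would penalize changes to object geometry and
  % scene connectivity, while E_vis and E_rel would model the perceptual
  % factors named in the abstract: dynamic visibility from the user's
  % first-person viewpoint and pairwise object-wise relationships. The
  % restructured scene S' minimizes E subject to fitting the user's
  % physical play area.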
Funding Information
  • National Natural Science Foundation of China (61802359 and 62025207)
  • Zhejiang Lab (2019NB0AB03)
  • USTC Research Funds of Double First-Class Initiative (YD0010002003)
