Haptic specification of environmental events: implications for the design of adaptive, virtual interfaces

Abstract
Future airborne crewstations are being designed to incorporate multisensory virtual displays that convey operationally relevant information to crew members. These displays and their associated controls will adapt to the changing psychological and physiological state of the user and to the tactical/environmental state of the external world. In support of this design goal, research is being conducted to explore the information extraction capabilities of the sensory modalities. Toward this end, an experiment was conducted to assess the degree to which force-reflecting haptic stimulation can provide individuals with information about their location and movement through space. Specifically, a force-reflecting, haptically augmented aircraft control stick was designed with the goal of providing pilots with real-time information concerning lateral deviation (or "line-up") with respect to the runway in a simulated instrument landing task. Pilots executed simulated landing approaches with either the force-reflecting stick or a standard aircraft displacement stick under either calm or turbulent conditions. The results indicated a consistent advantage in performance and perceived workload for the force-reflecting stick, particularly under simulated turbulence. The results are discussed in terms of their relevance to the design of advanced airborne crewstations that employ multisensory, adaptive, virtual interfaces.