Embedding the 2D interaction metaphor in a real 3D virtual environment

Abstract
Recent advances in both rendering algorithms and hardware have brought virtual reality to the threshold of being able to realistically model complex environments, e.g., the mockup of a large structure. As impressive as these advances have been, there is still little that a user can do within a VR system other than look and react. If VR is to be usable in a design setting, users will need to interact with, control, and modify their virtual environments using virtual tools inside those environments. In this paper we describe a realistic virtual computer/personal digital assistant that we have built. To the user, this virtual computer appears as a hand-held flat panel display. Input can be provided to this virtual computer by using a virtual finger or stylus to 'touch' the screen. We migrate applications developed for a flat-screen environment into the virtual environment without modifying the application code. A major strength of this approach is that we meld the naturally 3D interaction metaphor of a hand-held virtual tool with the software support provided by existing 2D user interface toolkits. Our approach has given us great flexibility in both designing and implementing user interface components for VR environments, and it has enabled us to represent the familiar flat-screen human-computer interaction metaphor within the VR context. We conclude by describing some applications that make use of this capability.
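To make the 'touch' interaction concrete, the sketch below (not the authors' code; the `Panel` class and all names are hypothetical) shows the kind of geometric mapping such a system needs: projecting a virtual stylus tip's 3D world position onto a hand-held panel quad and converting the contact point into 2D pixel coordinates that can be delivered to an unmodified flat-screen application as an ordinary pointer event.

```python
# Hedged sketch: mapping a 3D stylus tip to 2D application pixels.
# Assumes the panel is a planar quad defined by an origin (lower-left
# corner) and two orthonormal axes in world space.
from dataclasses import dataclass

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

@dataclass
class Panel:
    origin: tuple   # lower-left corner of the panel in world space
    u_axis: tuple   # unit vector along the panel's width
    v_axis: tuple   # unit vector along the panel's height
    width: float    # panel width in world units
    height: float   # panel height in world units
    res_x: int      # horizontal pixel resolution of the 2D app surface
    res_y: int      # vertical pixel resolution of the 2D app surface

    def touch_to_pixel(self, tip, max_dist=0.01):
        """Return (px, py) if the stylus tip touches the panel, else None.

        max_dist is the contact tolerance normal to the panel surface.
        """
        rel = sub(tip, self.origin)
        # Reject tips that are too far off the panel's plane.
        normal = cross(self.u_axis, self.v_axis)
        if abs(dot(rel, normal)) > max_dist:
            return None
        # Project onto the panel axes to get in-plane coordinates.
        u = dot(rel, self.u_axis)
        v = dot(rel, self.v_axis)
        if not (0.0 <= u <= self.width and 0.0 <= v <= self.height):
            return None
        # Scale to pixels; flip v because 2D screens put y=0 at the top.
        px = min(int(u / self.width * self.res_x), self.res_x - 1)
        py = min(int((1.0 - v / self.height) * self.res_y), self.res_y - 1)
        return (px, py)
```

A touch event at the returned (px, py) could then be synthesized for the hosted application, which is one way the flat-screen interaction metaphor can be carried into the 3D environment without modifying application code.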