A System for Microscope-Assisted Guided Interventions

Abstract
We present a system for surgical navigation in which stereo overlays, aligned to the operative scene, are displayed in the operating microscope. This augmented reality system provides 3D information about nearby structures and offers a significant advance over pointer-based guidance, which gives only the location of a single point and requires the surgeon to look away from the operative scene. A previous version of this system demonstrated feasibility, but it became clear that very high alignment accuracy is required to achieve convincing guidance through the magnified microscope view. We have since improved several aspects of the system, including automated calibration, error simulation, bone-implanted fiducials, and a dental attachment for tracking. We have also performed experiments to establish the visual display parameters required to perceive overlaid structures beneath the operative surface; easy perception of real and virtual structures with the correct transparency has been demonstrated both in the laboratory and through the microscope. The result is a system with a predicted accuracy of 0.9 mm and phantom errors of 0.5 mm. In clinical practice, errors are 0.5–1.5 mm, rising to 2–4 mm when brain deformation occurs.
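The abstract does not specify the registration algorithm, but fiducial-based alignment with a predicted accuracy figure is conventionally done with least-squares point registration plus a target registration error (TRE) estimate. The sketch below, in Python, shows one standard way to do this: the SVD method of Arun et al. for the rigid transform, and the Fitzpatrick et al. (1998) approximation for predicted TRE. All function and variable names are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of fiducial registration and error prediction for a
# microscope-guided navigation system; standard techniques, not necessarily
# the authors' exact implementation.
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid transform (R, t) mapping `moving` onto `fixed`,
    via the SVD method of Arun et al. Both inputs are (N, 3) arrays of
    corresponding fiducial positions."""
    cf, cm = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - cm).T @ (fixed - cf)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T       # proper rotation, det = +1
    t = cf - R @ cm
    return R, t

def predicted_tre(fiducials, target, fle_rms):
    """Approximate target registration error (Fitzpatrick et al., 1998):
    TRE^2 ~= (FLE^2 / N) * (1 + (1/3) * sum_k d_k^2 / f_k^2), where d_k is
    the target's distance from the k-th principal axis of the fiducial
    configuration and f_k is the fiducials' RMS distance from that axis."""
    c = fiducials.mean(axis=0)
    X = fiducials - c
    _, _, Vt = np.linalg.svd(X, full_matrices=False)  # principal axes
    ratio = 0.0
    for axis in Vt:
        # perpendicular distances from a unit axis: ||cross(p, axis)||
        f2 = np.mean(np.sum(np.cross(X, axis) ** 2, axis=1))
        d2 = np.sum(np.cross(target - c, axis) ** 2)
        ratio += d2 / f2
    return fle_rms * np.sqrt((1.0 + ratio / 3.0) / len(fiducials))

# Example: six bone-implanted fiducials (positions in mm, made up here),
# with an assumed 0.3 mm RMS fiducial localization error.
rng = np.random.default_rng(0)
planned = rng.uniform(-50.0, 50.0, size=(6, 3))
Q = np.linalg.qr(rng.normal(size=(3, 3)))[0]
R_true = Q if np.linalg.det(Q) > 0 else -Q
tracked = planned @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_register(tracked, planned)
print("TRE at centroid (mm):", predicted_tre(tracked, np.zeros(3), 0.3))
```

Predicted TRE grows for targets far from the fiducial centroid, which is why fiducial placement around the surgical target matters for the sub-millimetre accuracy figures quoted above.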
