Wide area camera calibration using virtual calibration objects

Abstract
The paper introduces a method to calibrate a wide-area system of unsynchronized cameras with respect to a single global coordinate system. The method is simple and does not require the physical construction of a large calibration object: the user need only wave an identifiable point in front of all cameras. The method generates a rough estimate of camera pose by first performing pairwise structure-from-motion on observed points, and then combining the pairwise registrations into a single coordinate frame. Using the initial camera poses, the moving point can be tracked in world space. The path of the point defines a "virtual calibration object" which can be used to improve the initial estimates of camera pose. Iterating the above process yields a more precise estimate of both the camera poses and the point path. Experimental results show that the method performs as well as calibration from a physical target in cases where all cameras share a common working volume. We then demonstrate its effectiveness in wide-area settings by calibrating a system of cameras in a configuration where traditional methods cannot be applied directly.
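The tracking step described above, recovering the waved point in world space from two cameras with current pose estimates, is typically done with linear (DLT) triangulation. The sketch below illustrates that step only; the projection matrices, the point, and the helper names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two views.
    P1, P2: 3x4 projection matrices; x1, x2: 2D image points."""
    # Each view contributes two rows of the homogeneous system A X = 0.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point with projection matrix P (pinhole model)."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Illustrative setup: identity intrinsics, second camera translated along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.3, -0.2, 4.0])  # one sample on the waved point's path
X_hat = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.allclose(X_hat, X_true, atol=1e-8))  # noise-free case recovers the point
```

In the paper's iterative scheme, the triangulated path would then serve as the virtual calibration object for re-estimating each camera's pose, and the two steps alternate until convergence.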
