From insect vision to robot vision

Abstract
Airborne insects are miniature wing-flapping aircraft whose visually guided manoeuvres depend on analogue, `fly-by-wire' controls. The front end of their visuomotor system consists of a pair of compound eyes, which are masterpieces of integrated optics and neural design. They rely on an array of passive sensors driving an orderly analogue neural network. We explored in concrete terms how motion-detecting neurons might be used to solve navigational tasks involving obstacle avoidance in a creature whose wings are exquisitely guided by eyes with poor spatial resolution. We designed, simulated, and built a complete terrestrial creature which moves about and avoids obstacles solely by evaluating the relative motion between itself and the environment. The compound eye uses an array of elementary motion detectors (EMDs) as smart, passive ranging sensors. Like its physiological counterpart, the visuomotor system is based on analogue, continuous-time processing and does not make use of conventional computers. It uses hardly any memory to adjust the robot's heading in real time via a local and intermittent visuomotor feedback loop. This paper shows that the understanding of some invertebrate sensory-motor systems has now reached a level at which it can provide valuable design hints. Our approach brings into prominence the mutual constraints on the designs of a sensory and a motor system, in both living and non-living ambulatory creatures.
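The abstract does not give the EMD equations, and the robot described here used analogue, continuous-time circuitry rather than software. As a purely illustrative sketch, the classic correlation-type EMD (in the spirit of the Hassenstein-Reichardt correlator, a standard model of insect motion detection) can be expressed in discrete time as follows; the function name, signal traces, and delay parameter are assumptions for illustration only.

```python
# Illustrative discrete-time sketch of a correlation-type elementary
# motion detector (EMD). NOTE: this is a textbook-style model, not the
# paper's actual analogue circuit.

def emd_response(left, right, delay=1):
    """Directional motion signal from two adjacent photoreceptor traces.

    left, right: equal-length sequences of photoreceptor samples.
    delay: temporal delay (in samples) applied to one channel before it
           is correlated with its undelayed neighbour.
    Returns one value per usable time step; positive values indicate
    motion from the `left` receptor toward the `right` one, negative
    values the reverse.
    """
    out = []
    for t in range(delay, len(left)):
        # Delay-and-correlate in both directions, then subtract the
        # mirror-symmetric half to obtain a signed (directional) output.
        out.append(left[t - delay] * right[t] - right[t - delay] * left[t])
    return out

# A bright edge sweeping left-to-right: the right receptor sees the same
# event one sample later, so the net detector output is positive.
left_trace = [0, 1, 0, 0, 0]
right_trace = [0, 0, 1, 0, 0]
print(emd_response(left_trace, right_trace))
```

In an array of such detectors, the response magnitude grows with image angular velocity, which for the robot's own translation scales inversely with object distance; this is what lets EMDs serve as passive ranging sensors for obstacle avoidance.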
