Wireless steerable vision for live insects and insect-scale robots
- 15 July 2020
- journal article
- research article
- Published by American Association for the Advancement of Science (AAAS) in Science Robotics
- Vol. 5 (44)
- https://doi.org/10.1126/scirobotics.abb0839
Abstract
Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost to support sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and evolve to move their visual systems independent of their bodies through head motion. By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics in a way that balances energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams “first person” 160 pixels–by–120 pixels monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio from up to 120 meters away. We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10–milliamp hour battery. We also built a small, terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views for 26 to 84 times lower energy than moving the whole robot.
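The abstract's accelerometer-gated capture (streaming frames only when the beetle moves, to stretch a 10 mAh battery to ~6 hours) can be sketched as a simple motion-threshold gate. This is a hypothetical illustration, not the authors' firmware: the rest magnitude, threshold, and function names are all assumptions.

```python
# Hypothetical sketch of accelerometer-triggered image capture: frames
# are only captured while the accelerometer reading deviates from rest
# (~1 g of gravity), saving radio and camera energy when the insect is
# still. Thresholds and units (g) are illustrative assumptions.

def magnitude(ax, ay, az):
    """Euclidean norm of one acceleration sample, in g."""
    return (ax * ax + ay * ay + az * az) ** 0.5

def should_capture(sample, rest_mag=1.0, threshold=0.05):
    """True when the sample deviates from the resting 1 g by more
    than the motion threshold, i.e. the carrier is moving."""
    return abs(magnitude(*sample) - rest_mag) > threshold

def gate_frames(samples):
    """Return indices of accelerometer samples that would trigger
    a frame capture."""
    return [i for i, s in enumerate(samples) if should_capture(s)]
```

A duty cycle like this trades temporal coverage for lifetime: the camera and Bluetooth link idle whenever the gate stays closed, which is where the reported multi-hour operating times come from.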
Funding Information
- National Science Foundation
- Microsoft Research