Embodied hyperacuity from Bayesian perception: Shape and position discrimination with an iCub fingertip sensor

Abstract
Recent advances in modeling animal perception have motivated the application of Bayesian perception to biomimetic robots. This study presents an initial application of Bayesian perception to an iCub fingertip sensor mounted on a dedicated positioning robot. We systematically probed the test system with five cylindrical stimuli offset over a range of positions relative to the fingertip. Testing the real-time speed and accuracy of shape and position discrimination, we achieved sub-millimeter accuracy within just a few taps. This result appears to be the first explicit demonstration of perceptual hyperacuity in robot touch, in that object positions are perceived more finely than the taxel spacing. We also found substantial performance gains when the fingertip can reposition itself to avoid locations where perception is poor, indicating that robot perception could be improved by mimicking active perception in animals.
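
The decision process described here, accumulating tactile evidence tap by tap with a recursive Bayesian update until one shape/position hypothesis is sufficiently probable, can be illustrated with a minimal sketch. The class set, the scalar tap feature, its Gaussian likelihood, and the 0.95 decision threshold below are illustrative assumptions, not the paper's actual taxel measurement model or parameters.

```python
import numpy as np

# A minimal sketch of sequential Bayesian evidence accumulation over taps,
# assuming per-class likelihoods are available (the paper would learn these
# from the fingertip's taxel responses; here they are hypothetical Gaussians).

rng = np.random.default_rng(0)

# Hypothetical perceptual classes: (cylinder id, lateral offset in mm).
offsets = [0.5 * k for k in range(-4, 5)]          # -2.0 mm .. +2.0 mm
classes = [(shape, off) for shape in range(5) for off in offsets]

# Assumed scalar tap feature per class (stand-in for learned taxel statistics).
mean_feature = {c: 0.3 * c[0] + 0.1 * c[1] for c in classes}
SIGMA = 0.2  # assumed measurement noise, shared across classes

def likelihood(z, c):
    """Unnormalized Gaussian likelihood of tap feature z under class c
    (the shared normalizer cancels in the posterior)."""
    m = mean_feature[c]
    return np.exp(-0.5 * ((z - m) / SIGMA) ** 2)

def discriminate(taps, threshold=0.95):
    """Update a uniform prior tap by tap; decide once one posterior crosses threshold."""
    posterior = {c: 1.0 / len(classes) for c in classes}
    for t, z in enumerate(taps, start=1):
        unnorm = {c: posterior[c] * likelihood(z, c) for c in classes}
        total = sum(unnorm.values())
        posterior = {c: v / total for c, v in unnorm.items()}
        best = max(posterior, key=posterior.get)
        if posterior[best] > threshold:
            return best, t, posterior[best]
    best = max(posterior, key=posterior.get)
    return best, len(taps), posterior[best]

# Simulated taps drawn from the true class (cylinder 2, offset +0.5 mm).
true_class = (2, 0.5)
taps = mean_feature[true_class] + SIGMA * rng.standard_normal(10)
decision, n_taps, belief = discriminate(taps)
print(f"decided {decision} after {n_taps} taps (belief {belief:.2f})")
```

Under this kind of scheme, a decision is typically reached after only a handful of taps, and the offset classes can be spaced more finely than the physical taxel pitch, which is the sense in which the paper reports hyperacuity.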
