3D hand recognition for telerobotics

Abstract
This paper presents a system for the recognition of 3D hand gestures for the purpose of controlling and manipulating robots. The objective of the work is to allow the robot to mimic or imitate the recognized gesture, which can be used for remote manipulation of a robotic arm to perform complex tasks (teleoperation). Telerobotic systems rely on computer vision to create the human-machine interface. In this project, hand tracking was used as an intuitive control interface because it represents a natural interaction medium. The system tracks the operator's hand and the gesture it represents, and relays the appropriate signal to the robot to perform the respective action in real time. The study focuses on two gestures, open hand and closed hand, as the NAO robot is not equipped with a dexterous hand. SURF feature points were used to represent the hand gesture, and the face-to-hand distance was used to gauge the depth of the hand. The system was tested with an Aldebaran NAO robot performing different gesture imitation tasks for picking and placing objects.
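As a rough illustration of the feature-extraction step mentioned above, the sketch below detects SURF keypoints on a hand image with OpenCV. This is not the paper's implementation: it assumes an OpenCV build with the contrib (nonfree) modules enabled, and the image path and Hessian threshold are placeholder values chosen for the example.

```python
import cv2

# Load a frame containing the operator's hand (placeholder file name).
frame = cv2.imread("hand_frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# SURF is provided by the contrib "nonfree" module; the Hessian threshold
# controls how many keypoints are detected (400 is an arbitrary choice here).
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

# Detect keypoints and compute their SURF descriptors for the hand region.
keypoints, descriptors = surf.detectAndCompute(gray, None)
print("Detected %d SURF keypoints" % len(keypoints))

# Overlay the keypoints on the original frame for inspection.
vis = cv2.drawKeypoints(frame, keypoints, None,
                        flags=cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
cv2.imwrite("hand_keypoints.png", vis)
```

In a gesture-recognition pipeline such as the one described, descriptors extracted in this way would then be matched against templates of the open-hand and closed-hand gestures before a command is relayed to the robot.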
