Nonverbal communication with a humanoid robot via head gestures

Abstract
Socially interactive robots require sophisticated perception and cognition abilities to behave and interact in a natural, human-like way. Properly perceiving the behavior of an interaction partner plays a crucial role in social robotics, and interpreting these behaviors by mapping them to their intended meanings is equally important for interactive robots. This paper proposes an interaction model for communicating verbally and nonverbally with humans. Human behavior during the interaction with the robot is perceived and then interpreted according to the situation in which it is detected. In this model, head gestures serve as a backchannel (feedback) that allows the robot to adapt the interaction scenario. These backchannel signals can be generated consciously or unconsciously by the human. Simultaneously, eye gaze is detected to ensure the correct interpretation of the head gestures. To recognize human head gestures, the head pose is tracked over time. A stream of images with corresponding depth information, acquired from a Kinect sensor, is used to detect, track, and estimate the human head pose. The proposed model has been tested in various experiments covering different human-robot interaction scenarios.
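To illustrate the kind of backchannel recognition the abstract describes, the sketch below classifies nod and shake gestures from a stream of head pose angles, such as those produced by a Kinect-based head pose estimator. It is a minimal illustration, not the paper's actual algorithm: the sliding-window size, the amplitude threshold, and the pitch-versus-yaw dominance rule are all assumptions introduced here for clarity.

```python
import math
from collections import deque


class HeadGestureDetector:
    """Classifies nod/shake gestures from a stream of head pose angles.

    Each update receives (pitch, yaw) in degrees. A gesture is reported
    when the peak-to-peak motion within the sliding window exceeds a
    threshold. Window size and threshold are illustrative assumptions,
    not values from the paper.
    """

    def __init__(self, window=15, amplitude_deg=10.0):
        self.window = deque(maxlen=window)   # recent (pitch, yaw) samples
        self.amplitude_deg = amplitude_deg   # minimum peak-to-peak motion

    def update(self, pitch, yaw):
        self.window.append((pitch, yaw))
        if len(self.window) < self.window.maxlen:
            return None  # not enough history yet
        pitches = [p for p, _ in self.window]
        yaws = [y for _, y in self.window]
        pitch_range = max(pitches) - min(pitches)
        yaw_range = max(yaws) - min(yaws)
        # A nod is dominated by vertical (pitch) motion, a shake by
        # horizontal (yaw) motion; small, ambiguous movements are ignored.
        if pitch_range >= self.amplitude_deg and pitch_range > yaw_range:
            return "nod"    # e.g. interpreted as agreement feedback
        if yaw_range >= self.amplitude_deg and yaw_range > pitch_range:
            return "shake"  # e.g. interpreted as disagreement feedback
        return None


# Example: feed synthetic pose samples resembling a nodding motion.
if __name__ == "__main__":
    detector = HeadGestureDetector()
    for t in range(30):
        pitch = 12.0 * math.sin(t * 0.8)  # oscillating pitch: nodding
        yaw = 0.5 * math.sin(t * 0.3)     # nearly stationary yaw
        gesture = detector.update(pitch, yaw)
        if gesture:
            print(f"frame {t}: detected {gesture}")
```

In a full pipeline along the lines described in the abstract, the detected gesture would be combined with the concurrently estimated eye gaze before being treated as backchannel feedback, so that head motion not directed at the robot is not misread as agreement or disagreement.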
Funding Information
  • DAAD, Germany
  • Ministry of Higher Education and Scientific Research of Iraq
