Nonverbal communication with a humanoid robot via head gestures
- 1 July 2015
- conference paper
- Published by Association for Computing Machinery (ACM) in Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA)
Abstract
Social interactive robots require sophisticated perception and cognition abilities to behave and interact in a natural, human-like way. Properly perceiving the behavior of the interaction partner plays a crucial role in social robotics, and interactive robots must also be able to interpret these behaviors and map them to their intended meanings. This paper proposes an interaction model for communicating verbally and nonverbally with humans. Human behavior during the interaction with the robot is perceived and then interpreted depending on the situation in which the behavior has been detected. In this model, head gestures serve as a back channel (feedback) that lets the robot adapt the interaction scenario; these back-channel signals can be generated consciously or unconsciously by the human. Simultaneously, eye gaze is detected to ensure the correct interpretation of head gestures. To recognize head gestures, head poses are tracked over time: a stream of images with corresponding depth information, acquired from a Kinect sensor, is used to find, track, and estimate the head poses of the human. The proposed model has been tested in experiments covering several human-robot interaction scenarios.
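To illustrate the kind of gesture recognition the abstract describes, here is a minimal sketch of classifying a nod versus a shake from a sequence of head-pose angles. The function name, the threshold, and the peak-to-peak heuristic are all illustrative assumptions, not the paper's actual method; the paper estimates head poses from Kinect depth data, which this sketch stands in for with a precomputed list of (pitch, yaw) angles.

```python
# Hypothetical sketch: classify a head gesture from a per-frame sequence of
# head-pose angles, standing in for the pose stream a Kinect sensor would
# provide. Not the paper's actual recognition method.

def classify_head_gesture(poses, min_swing=8.0):
    """Classify a pose sequence as 'nod', 'shake', or 'none'.

    poses: list of (pitch, yaw) tuples in degrees, one per frame.
    min_swing: minimum peak-to-peak angle (degrees) to count as a gesture.
    """
    pitches = [pitch for pitch, _ in poses]
    yaws = [yaw for _, yaw in poses]
    pitch_swing = max(pitches) - min(pitches)
    yaw_swing = max(yaws) - min(yaws)
    # Too little motion in either axis: no gesture detected.
    if pitch_swing < min_swing and yaw_swing < min_swing:
        return "none"
    # A nod is dominated by vertical (pitch) motion, a shake by horizontal (yaw).
    return "nod" if pitch_swing >= yaw_swing else "shake"


# Example: a pitch oscillation reads as a nod, a yaw oscillation as a shake.
print(classify_head_gesture([(0, 0), (10, 1), (-8, 0), (9, -1), (0, 0)]))   # nod
print(classify_head_gesture([(0, 0), (1, 12), (0, -10), (-1, 11), (0, 0)])) # shake
print(classify_head_gesture([(0, 0), (1, 1), (0, 0)]))                      # none
```

In a real system such a rule would be applied over a sliding window of tracked poses, and the detected gesture would be interpreted in context (as the abstract notes, together with eye gaze) before being fed back to the dialogue controller.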
Funding Information
- DAAD, Germany
- Ministry of Higher Education and Scientific Research of Iraq
This publication has 14 references indexed in Scilit:
- Robust Head Gestures Recognition for Assistive Technology. Lecture Notes in Computer Science, 2014
- Multi-Initialized States Referred Work Parameter Calibration for Gaze Tracking Human-Robot Interaction. International Journal of Advanced Robotic Systems, 2012
- Did I Get It Right: Head Gestures Analysis for Human-Machine Interactions. Lecture Notes in Computer Science, 2009
- Visual-Based Emotion Detection for Natural Man-Machine Interaction. Lecture Notes in Computer Science, 2008
- Head gestures for perceptual interfaces: The role of context in improving recognition. Artificial Intelligence, 2007
- A tutorial on support vector regression. Statistics and Computing, 2004
- Robust Real-Time Face Detection. International Journal of Computer Vision, 2004
- Detecting faces in images: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001
- Face Detection: A Survey. Computer Vision and Image Understanding, 2001
- Head movement during listening turns in conversation. Journal of Nonverbal Behavior, 1985