Bi-modal emotion recognition from expressive face and body gestures
- 1 November 2007
- journal article
- Published by Elsevier BV in Journal of Network and Computer Applications
- Vol. 30 (4), 1334-1345
- https://doi.org/10.1016/j.jnca.2006.09.007
Abstract
Psychological research findings suggest that humans rely on the combined visual channels of face and body more than any other channel when they make judgments about human communicative behavior. However, most existing systems attempting to analyze human nonverbal behavior are mono-modal and focus only on the face. Research that aims to integrate gestures as an expressive means has only recently emerged. Accordingly, this paper presents an approach to automatic visual recognition of expressive face and upper-body gestures from video sequences, suitable for use in a vision-based affective multi-modal framework. Face and body movements are captured simultaneously using two separate cameras. For each video sequence, single expressive frames from both face and body are selected manually for analysis and recognition of emotions. Firstly, individual classifiers are trained from the individual modalities. Secondly, we fuse facial expression and affective body gesture information at the feature level and at the decision level. In the experiments performed, emotion classification using the two modalities achieved better recognition accuracy, outperforming classification using the individual facial or bodily modality alone.
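The two fusion strategies named in the abstract can be sketched in miniature: feature-level fusion concatenates the face and body feature vectors before classification, while decision-level fusion combines the class-probability outputs of the two unimodal classifiers (e.g. with a sum or product rule). This is an illustrative sketch under those assumptions, not the paper's implementation; all names and the toy probabilities are invented for the example.

```python
# Illustrative sketch of bi-modal fusion (not the paper's actual code).

def feature_level_fusion(face_features, body_features):
    """Concatenate the two modality feature vectors into one vector."""
    return face_features + body_features  # list concatenation

def decision_level_fusion(face_probs, body_probs, rule="sum"):
    """Combine unimodal class probabilities with the sum or product rule."""
    if rule == "sum":
        fused = [f + b for f, b in zip(face_probs, body_probs)]
    else:  # product rule
        fused = [f * b for f, b in zip(face_probs, body_probs)]
    total = sum(fused)
    return [v / total for v in fused]  # renormalize to a distribution

# Hypothetical per-class probabilities (e.g. anger, fear, joy) emitted by
# a face classifier and a body-gesture classifier for one video frame pair.
face_out = [0.7, 0.2, 0.1]
body_out = [0.6, 0.3, 0.1]
fused = decision_level_fusion(face_out, body_out)
prediction = max(range(len(fused)), key=fused.__getitem__)
```

In feature-level fusion a single classifier is then trained on the concatenated vector, whereas in decision-level fusion each modality keeps its own classifier and only the outputs are merged; the abstract reports that both bi-modal schemes outperformed either modality alone.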