Multimodal information fusion application to human emotion recognition from face and speech
- 20 August 2009
- journal article
- Published by Springer Science and Business Media LLC in Multimedia Tools and Applications
- Vol. 49 (2), 277-297
- https://doi.org/10.1007/s11042-009-0344-2
Abstract
No abstract available.
This publication has 26 references indexed in Scilit:
- A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008
- Interrelation Between Speech and Facial Gestures in Emotional Utterances: A Single Subject Study. IEEE Transactions on Audio, Speech, and Language Processing, 2007
- How emotion is made and measured. International Journal of Human-Computer Studies, 2007
- Bimodal emotion recognition. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- Emotion recognition in human-computer interaction. IEEE Signal Processing Magazine, 2001
- Audio-visual speech modeling for continuous speech recognition. IEEE Transactions on Multimedia, 2000
- Efficient region tracking with parametric models of geometry and illumination. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998
- Recognizing Facial Expressions in Image Sequences Using Local Parameterized Models of Image Motion. International Journal of Computer Vision, 1997
- Facial expression and emotion. American Psychologist, 1993
- Emotion recognition: The role of facial movement and the relative importance of upper and lower areas of the face. Journal of Personality and Social Psychology, 1979