IEMOCAP: interactive emotional dyadic motion capture database
- 5 November 2008
- journal article
- Published by Springer Science and Business Media LLC in Language Resources and Evaluation
- Vol. 42 (4), 335-359
- https://doi.org/10.1007/s10579-008-9076-6
Abstract
No abstract available