Tracking continuous emotional trends of participants during affective dyadic interactions using body language and speech information
- 28 February 2013
- journal article
- Published by Elsevier BV in Image and Vision Computing
- Vol. 31 (2), 137-152
- https://doi.org/10.1016/j.imavis.2012.08.018
Abstract
No abstract available.
This publication has 26 references indexed in Scilit:
- Ranking-Based Emotion Recognition for Music Organization and Retrieval. IEEE Transactions on Audio, Speech, and Language Processing, 2010
- Combining Long Short-Term Memory and Dynamic Bayesian Networks for Incremental Emotion-Sensitive Artificial Listening. IEEE Journal of Selected Topics in Signal Processing, 2010
- A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008
- Statistical mapping between articulatory movements and acoustic spectrum using a Gaussian mixture model. Speech Communication, 2008
- Bi-modal emotion recognition from expressive face and body gestures. Journal of Network and Computer Applications, 2007
- Primitives-based evaluation and estimation of emotions in speech. Speech Communication, 2007
- Extracting moods from pictures and sounds: towards truly personalized TV. IEEE Signal Processing Magazine, 2006
- Affective video content representation and modeling. IEEE Transactions on Multimedia, 2005
- Evidence for a three-factor theory of emotions. Journal of Research in Personality, 1977
- Three dimensions of emotion. Psychological Review, 1954