A Dynamic Appearance Descriptor Approach to Facial Actions Temporal Modeling
- 19 April 2013
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Cybernetics
- Vol. 44 (2), 161-174
- https://doi.org/10.1109/tcyb.2013.2249063
Abstract
Both the configuration and the dynamics of facial expressions are crucial for the interpretation of human facial behavior. Yet to date, the vast majority of reported efforts in the field either do not take the dynamics of facial expressions into account or focus only on prototypic facial expressions of the six basic emotions. Facial dynamics can be analyzed explicitly by detecting the constituent temporal segments of Facial Action Coding System (FACS) Action Units (AUs): onset, apex, and offset. In this paper, we present a novel approach to the explicit analysis of the temporal dynamics of facial actions using the dynamic appearance descriptor Local Phase Quantization from Three Orthogonal Planes (LPQ-TOP). Temporal segments are detected by combining a discriminative classifier, which labels the temporal segments on a frame-by-frame basis, with Markov models that enforce temporal consistency over the whole episode. The system is evaluated in detail in database-dependent experiments on the MMI facial expression database, the UNBC-McMaster pain database, the SAL database, and the GEMEP-FERA dataset, and in cross-database experiments using the Cohn-Kanade and SEMAINE databases. The comparison with other state-of-the-art methods shows that the proposed LPQ-TOP method outperforms the other approaches on the problem of AU temporal segment detection, and that overall AU activation detection benefits from dynamic appearance information.
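To illustrate the second stage described in the abstract — enforcing temporal consistency of per-frame segment labels with a Markov model — here is a minimal sketch of a Viterbi decode over the neutral → onset → apex → offset cycle. This is not the paper's implementation: the transition probabilities and the per-frame classifier probabilities are illustrative assumptions, and the state set is reduced to the four canonical segments.

```python
import math

# Temporal segments of an AU episode; transitions follow the
# neutral -> onset -> apex -> offset -> neutral cycle.
STATES = ["neutral", "onset", "apex", "offset"]

# Illustrative transition probabilities (NOT taken from the paper);
# zero entries forbid temporally inconsistent jumps such as neutral -> apex.
TRANS = {
    "neutral": {"neutral": 0.9, "onset": 0.1, "apex": 0.0, "offset": 0.0},
    "onset":   {"neutral": 0.0, "onset": 0.8, "apex": 0.2, "offset": 0.0},
    "apex":    {"neutral": 0.0, "onset": 0.0, "apex": 0.8, "offset": 0.2},
    "offset":  {"neutral": 0.2, "onset": 0.0, "apex": 0.0, "offset": 0.8},
}

def lg(p):
    # Log-probability, with log(0) mapped to -inf so forbidden paths drop out.
    return math.log(p) if p > 0 else float("-inf")

def viterbi(frame_probs, start="neutral"):
    """frame_probs: one dict per frame mapping segment -> classifier probability.
    Returns the most probable temporally consistent segment sequence."""
    score = {s: lg(frame_probs[0][s]) if s == start else float("-inf")
             for s in STATES}
    backptrs = []
    for obs in frame_probs[1:]:
        new_score, ptr = {}, {}
        for s in STATES:
            prev, best = max(((p, score[p] + lg(TRANS[p][s])) for p in STATES),
                             key=lambda t: t[1])
            new_score[s] = best + lg(obs[s])
            ptr[s] = prev
        score = new_score
        backptrs.append(ptr)
    # Backtrace from the best final state.
    path = [max(score, key=score.get)]
    for ptr in reversed(backptrs):
        path.append(ptr[path[-1]])
    return path[::-1]
```

Given hypothetical frame-wise probabilities that favor neutral, onset, apex, offset, and neutral in turn, `viterbi` recovers the full segment sequence `["neutral", "onset", "apex", "offset", "neutral"]`; because the transition matrix assigns zero probability to out-of-order jumps, isolated misclassified frames cannot produce sequences such as apex appearing before onset.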