Machine Understanding of Emotion and Sentiment

Abstract
Emotions are subjective experiences involving perceptual and contextual factors [4]. There is no objective tool for the precise measurement of emotions. However, we can anticipate an emotion's emergence through knowledge of common responses to events in similar situations. We can also measure proxies of emotions by recognizing emotional expressions [3]. Studying emotional responses to multimedia allows us to identify the emotions expected in users consuming the content. For example, abrupt loud voices are novel and unsettling, resulting in surprise and a heightened experience of arousal [2, 6]. For a particular type of content such as music, mid-level attributes such as rhythmic stability or melodiousness have a strong association with expected emotions [1]. Given that such mid-level attributes are more closely related to the content, their machine perception is more straightforward. Moreover, their perception in combination with user models enables building person-specific emotion anticipation models. In addition to studying expected emotions, we can also observe users' emotional reactions to understand emotion in multimedia. Typical methods of emotion recognition include recognizing emotions from facial or vocal expressions. Recognizing emotional expressions requires a large amount of labeled data, which is expensive to produce. Hence, the most recent advances in machine-based emotion perception include methods that leverage unlabeled data through self-supervised and semi-supervised learning [3, 5]. In this talk, I review the field and showcase methods for automatic modeling and recognition of emotions and sentiment in different contexts [3, 8]. I show how we can identify underlying factors contributing to the construction of the subjective experience of emotions [1, 7]. Identifying these factors allows us to use them as mid-level attributes to build machine learning models for emotion and sentiment understanding. 
I also show how emotions and sentiment can be recognized from expressions with the goal of building empathetic autonomous agents [8].
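The idea of using mid-level attributes as predictors of expected emotion can be sketched in a few lines. The following is a minimal illustration only, not the talk's actual method: the attribute scores (rhythmic stability, melodiousness), the annotation values, and the linear least-squares model are all hypothetical toy choices standing in for learned attribute extractors and real emotion annotations.

```python
import numpy as np

# Hypothetical mid-level attribute scores for four music clips:
# columns = [rhythmic stability, melodiousness], each in [0, 1].
X = np.array([
    [0.9, 0.8],   # stable, melodious clip
    [0.2, 0.3],   # unstable, less melodious clip
    [0.7, 0.9],
    [0.1, 0.2],
])

# Hypothetical expected-emotion annotations: [valence, arousal] in [-1, 1].
Y = np.array([
    [ 0.8, -0.2],
    [-0.5,  0.6],
    [ 0.7, -0.1],
    [-0.6,  0.7],
])

# Append a bias column and fit a linear map from mid-level attributes
# to emotion dimensions with ordinary least squares.
Xb = np.hstack([X, np.ones((len(X), 1))])
W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)

def predict_emotion(attributes):
    """Map mid-level attribute scores to predicted (valence, arousal)."""
    return np.hstack([attributes, 1.0]) @ W

print(predict_emotion(np.array([0.8, 0.85])))
```

A person-specific variant, as suggested in the abstract, would fit a separate map (or add user features) per listener rather than a single shared `W`.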
Funding Information
  • Army Research Office (W911NF-20-2-0053)