Speech-based emotion classification framework for driver assistance systems

Abstract
Automated analysis of human affective behavior has attracted increasing attention in recent years. A driver's emotion often influences driving performance, which can improve if the car actively responds to the driver's emotional state. It is therefore important for an intelligent driver support system to monitor the driver's state accurately, unobtrusively, and robustly. The ever-changing environment encountered while driving poses a serious challenge to existing speech emotion recognition techniques. In this paper, we exploit contextual information about both the outside environment and the in-car user to improve emotion recognition accuracy. In particular, a noise cancellation technique adaptively suppresses noise based on the driving context, and gender-based context information is incorporated into the classifier. Experimental analyses show promising results.
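The abstract does not specify which noise cancellation algorithm is used. As a hedged illustration only, the sketch below shows one standard form of adaptive noise cancellation, an LMS filter, which suppresses car noise from a speech signal using a correlated noise reference (e.g. a second microphone near the engine); the function name, tap count, and step size are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def lms_noise_canceller(noisy, noise_ref, n_taps=8, mu=0.005):
    """Illustrative LMS adaptive noise canceller (not the paper's method).

    `noisy`     : speech corrupted by additive noise
    `noise_ref` : reference signal correlated with the noise, but not
                  with the speech (e.g. an engine-side microphone)
    Returns the error signal, which converges to the cleaned speech.
    """
    w = np.zeros(n_taps)                  # adaptive filter weights
    cleaned = np.zeros(len(noisy))
    for n in range(n_taps - 1, len(noisy)):
        # most recent n_taps reference samples, newest first
        x = noise_ref[n - n_taps + 1:n + 1][::-1]
        noise_est = w @ x                 # filter output = noise estimate
        e = noisy[n] - noise_est          # error = cleaned speech sample
        w += 2 * mu * e * x               # LMS gradient-descent update
        cleaned[n] = e
    return cleaned
```

A driving-context-aware system could, for instance, switch `mu` or `n_taps` depending on whether the detected context is highway, city traffic, or idling, since the noise statistics differ across those conditions.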