Multi-modal data fusion for pain intensity assessment and classification

Abstract
In this work, several fusion architectures are assessed within the scope of developing a pain intensity classification system. The assessment is based on the recently recorded SenseEmotion Database [1], which consists of recordings of several individuals subjected to three gradually increasing levels of pain intensity, induced through temperature elevation (heat stimulation) under controlled conditions. Several modalities, including audio, video, respiration, electrocardiography, electromyography and electrodermal activity, were synchronously recorded during the experiments. A broad spectrum of descriptors is extracted from each of the involved modalities, and the combination of these descriptors is then assessed through several fusion architectures. Experimental validation suggests that the choice of a fusion architecture able to significantly improve on the performance of the best single modality depends mainly on the amount of data available for training the classification architecture.
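
The abstract does not specify the concrete fusion architectures under study, so the following is only a minimal illustrative sketch contrasting two common baseline choices in multimodal classification: early (feature-level) fusion, which concatenates per-modality descriptors before a single classifier, and late (decision-level) fusion, which averages the outputs of per-modality classifiers. All names, feature dimensions, the synthetic data and the random-forest classifier are assumptions for demonstration, not the paper's method.

```python
# Illustrative sketch only: feature dimensions, classifier choice and the
# early-/late-fusion designs below are assumptions, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-modality descriptor matrices
# (n_samples x n_features); three pain intensity levels as labels.
n = 200
modalities = {
    "audio": rng.normal(size=(n, 40)),
    "video": rng.normal(size=(n, 60)),
    "emg":   rng.normal(size=(n, 20)),
    "eda":   rng.normal(size=(n, 10)),
}
y = rng.integers(0, 3, size=n)  # pain intensity classes 0..2

# Early (feature-level) fusion: concatenate all descriptors,
# then train a single classifier on the joint feature vector.
X_early = np.concatenate(list(modalities.values()), axis=1)
early_clf = RandomForestClassifier(random_state=0).fit(X_early, y)

# Late (decision-level) fusion: train one classifier per modality,
# then average their class-probability outputs at prediction time.
per_modality = {
    name: RandomForestClassifier(random_state=0).fit(X, y)
    for name, X in modalities.items()
}

def late_fusion_predict(samples: dict) -> np.ndarray:
    """Average per-modality class probabilities and take the argmax."""
    probs = np.mean(
        [clf.predict_proba(samples[name]) for name, clf in per_modality.items()],
        axis=0,
    )
    return probs.argmax(axis=1)
```

The sketch also hints at the abstract's data-dependence claim: early fusion must fit one model in a high-dimensional joint space, so it typically needs more training data than late fusion, which learns several lower-dimensional models independently.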