Painful data: The UNBC-McMaster shoulder pain expression archive database

Abstract
A major factor hindering the deployment of a fully functional automatic facial expression detection system is the lack of representative data. One solution is to narrow the context of the target application so that enough data is available to build robust models and achieve high performance. Automatic pain detection from a patient's face represents one such application. To facilitate this work, researchers at McMaster University and the University of Northern British Columbia captured video of the faces of participants who were suffering from shoulder pain while they performed a series of active and passive range-of-motion tests on their affected and unaffected limbs on two separate occasions. Each frame of this data was AU coded by certified FACS coders, and self-report and observer measures were taken at the sequence level. This database is called the UNBC-McMaster Shoulder Pain Expression Archive Database. To promote and facilitate research into pain and to augment current datasets, we have made publicly available a portion of this database, which includes: (1) 200 video sequences containing spontaneous facial expressions, (2) 48,398 FACS coded frames, (3) associated frame-by-frame pain scores and sequence-level self-report and observer measures, and (4) 66-point AAM landmarks. This paper documents this data distribution and describes baseline results of our AAM/SVM system. The data will be available for distribution in March 2011.