Using a complex multi-modal on-body sensor system for activity spotting

Abstract
This paper describes an approach to real-life task tracking using a multi-modal, on-body sensor system. The specific example that we study is quality inspection in car production. This task is composed of up to 20 activity classes such as checking gaps between parts of the chassis, opening and closing the hood and trunk, moving the driver's seat, and turning the steering wheel. Most of these involve subtle, short movements and show a high degree of variability in the way they are performed. To nonetheless spot these actions in a continuous data stream, we use a wearable system composed of 7 motion sensors, 16 force-sensing resistors (FSRs) for lower-arm muscle monitoring, and 4 ultra-wideband (UWB) tags for tracking user position. We propose a recognition approach that deals separately with each activity class and then merges the results in a final reasoning step. This allows us to fine-tune the system parameters separately for each activity. It also means that the system can be easily extended to accommodate further activities. To demonstrate the feasibility of our approach we present the results of a study with 8 participants and a total of 2394 activities.
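The abstract's central architectural idea is a set of per-class spotters, each tuned independently, whose detections are merged in a final reasoning step. The following Python sketch illustrates that structure under our own simplifying assumptions; the class names, features, windowing scheme, thresholds, and the overlap-based merging rule are hypothetical placeholders, not the paper's actual method.

```python
# Minimal sketch (not the authors' implementation) of per-class activity
# spotting followed by a final merging step. All activity names, features,
# window lengths, and thresholds below are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable, List
import numpy as np


@dataclass
class Detection:
    activity: str
    start: int      # sample index where the spotted segment begins
    end: int        # sample index where it ends
    score: float    # class-specific confidence


class ClassSpotter:
    """One spotter per activity class, with its own window length and
    threshold, so each class can be fine-tuned independently."""

    def __init__(self, activity: str, window: int, threshold: float,
                 score_fn: Callable[[np.ndarray], float]):
        self.activity = activity
        self.window = window
        self.threshold = threshold
        self.score_fn = score_fn

    def spot(self, stream: np.ndarray) -> List[Detection]:
        detections = []
        step = max(1, self.window // 2)  # 50% window overlap
        for start in range(0, len(stream) - self.window + 1, step):
            segment = stream[start:start + self.window]
            score = self.score_fn(segment)
            if score >= self.threshold:
                detections.append(
                    Detection(self.activity, start, start + self.window, score))
        return detections


def merge_detections(detections: List[Detection]) -> List[Detection]:
    """Simplified stand-in for the final reasoning step: where detections
    from different class-specific spotters overlap in time, keep only the
    highest-scoring one."""
    accepted: List[Detection] = []
    for det in sorted(detections, key=lambda d: d.score, reverse=True):
        if all(det.end <= a.start or det.start >= a.end for a in accepted):
            accepted.append(det)
    return sorted(accepted, key=lambda d: d.start)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = rng.standard_normal(2000)  # stand-in for fused sensor features

    spotters = [
        ClassSpotter("open_hood", window=100, threshold=0.85,
                     score_fn=lambda seg: float(np.mean(np.abs(seg)))),
        ClassSpotter("turn_steering_wheel", window=150, threshold=1.03,
                     score_fn=lambda seg: float(np.std(seg))),
    ]

    all_detections = [d for sp in spotters for d in sp.spot(stream)]
    for det in merge_detections(all_detections):
        print(f"{det.activity}: samples {det.start}-{det.end} "
              f"(score {det.score:.2f})")
```

Because each spotter is self-contained, adding a further activity class amounts to registering one more `ClassSpotter` with its own parameters, which mirrors the extensibility argument made in the abstract.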