Human Joint Angle Estimation with Inertial Sensors and Validation with A Robot Arm

Abstract
Traditionally, human movement has been captured primarily by motion capture systems. These systems are costly, require fixed cameras in a controlled environment, and suffer from occlusion. Recently, the availability of low-cost wearable inertial sensors containing accelerometers, gyroscopes, and magnetometers has provided an alternative means to overcome the limitations of motion capture systems. Wearable inertial sensors can be used anywhere, cannot be occluded, and are low cost. Several groups have described algorithms for tracking human joint angles. We previously described a novel approach based on a kinematic arm model and the Unscented Kalman Filter (UKF). Our proposed method used a minimal sensor configuration with one sensor on each segment. This paper reports significant improvements in both the algorithm and its assessment. The new model incorporates gyroscope and accelerometer random drift models, imposes physical constraints on the range of motion of each joint, and uses zero-velocity updates to mitigate the effect of sensor drift. A high-precision industrial robot arm quantifies the performance of the tracker during slow, normal, and fast movements over continuous 15-minute recordings. The agreement between the angles estimated by our algorithm and the robot arm reference was excellent. On average, the tracker attained an RMS angle error of about 3° across all six angles. The UKF performed slightly better than the more common Extended Kalman Filter.
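The drift-mitigation ideas summarized in the abstract (estimating gyroscope bias, constraining each joint to its physical range of motion, and applying zero-velocity updates) can be illustrated with a much-simplified sketch for a single joint. The paper's actual method uses a full kinematic arm model with a UKF; the linear Kalman filter, parameter values, and function name below are illustrative assumptions only, not the authors' implementation.

```python
def track_joint_angle(gyro, accel_angle, dt=0.01,
                      angle_range=(0.0, 2.5),
                      zupt_thresh=0.02):
    """Simplified 1-D joint-angle tracker (illustrative sketch).

    State: [angle, gyro_bias]. The gyroscope rate drives the
    prediction; an accelerometer-derived angle serves as the
    measurement. Joint-limit clamping and zero-velocity updates
    mitigate drift. All tuning values are hypothetical.
    """
    angle, bias = 0.0, 0.0
    P = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    q_angle, q_bias, r = 1e-4, 1e-6, 0.05  # illustrative noise terms
    out = []
    for w, z in zip(gyro, accel_angle):
        # Predict: integrate the bias-corrected gyro rate.
        angle += (w - bias) * dt
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0]) + q_angle
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += q_bias
        # Update with the accelerometer-derived angle (H = [1, 0]).
        s = P[0][0] + r
        k0, k1 = P[0][0] / s, P[1][0] / s
        y = z - angle
        angle += k0 * y
        bias += k1 * y
        p00, p01 = P[0][0], P[0][1]
        P[0][0] -= k0 * p00
        P[0][1] -= k0 * p01
        P[1][0] -= k1 * p00
        P[1][1] -= k1 * p01
        # Zero-velocity update: a near-zero corrected rate suggests
        # the segment is still, so refine the gyro bias estimate.
        if abs(w - bias) < zupt_thresh:
            bias = 0.9 * bias + 0.1 * w
        # Physical range-of-motion constraint on the joint angle.
        angle = min(max(angle, angle_range[0]), angle_range[1])
        out.append(angle)
    return out
```

On synthetic data with a constant gyro bias and noisy measurements, this sketch keeps the angle estimate bounded instead of drifting, which is the qualitative effect the abstract attributes to the drift models, joint constraints, and zero-velocity updates.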
