Abstract

Human motion tracking is a fundamental building block for applications including computer animation, human-computer interaction, and healthcare. To reduce the burden of wearing multiple sensors, human motion prediction from sparse sensor inputs has become an active topic in human motion tracking. However, such prediction is non-trivial because (i) widely adopted data-driven approaches can easily collapse to average poses, and (ii) the predicted motions contain unnatural jitter. In this work, we address these issues by proposing a novel framework that accurately predicts human joint angles from the signals of only four flexible sensors, thereby tracking human joints with multiple degrees of freedom. Specifically, we mitigate the collapse to average poses by implementing the model with a Bi-LSTM neural network that makes full use of short-time sequence information, and we reduce jitter by adding a median pooling layer to the network, which smooths consecutive motions. Although bio-compatible and well suited to a comfortable wearing experience, flexible sensors are prone to aging, which increases prediction errors. Observing that the aging of flexible sensors usually manifests as drift in their resistance ranges, we further propose a novel dynamic calibration technique that rescales sensor ranges, which further improves prediction accuracy. Experimental results show that our method achieves a low and stable tracking error of 4.51 degrees across different motion types with only four sensors.
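The dynamic calibration idea described above (rescaling each sensor's drifting resistance range) can be sketched roughly as follows. This is an illustrative assumption, not the paper's exact procedure: we track a running per-sensor minimum and maximum and normalize each raw reading against that window, so that slow drift in a sensor's resistance range is continually absorbed. The class and method names are hypothetical.

```python
import numpy as np


class DynamicCalibrator:
    """Rescale raw flexible-sensor resistance readings to [0, 1].

    Aging shifts a sensor's resistance range, so instead of a fixed
    factory calibration we keep a running min/max per sensor and
    normalize against that observed window. (Illustrative sketch,
    not the paper's exact calibration method.)
    """

    def __init__(self, n_sensors: int):
        # Running bounds per sensor; updated with every reading.
        self.lo = np.full(n_sensors, np.inf)
        self.hi = np.full(n_sensors, -np.inf)

    def update(self, reading) -> np.ndarray:
        """Absorb one raw reading and return it rescaled to [0, 1]."""
        reading = np.asarray(reading, dtype=float)
        self.lo = np.minimum(self.lo, reading)
        self.hi = np.maximum(self.hi, reading)
        # Guard against a zero-width window before any range is observed.
        span = np.where(self.hi > self.lo, self.hi - self.lo, 1.0)
        return (reading - self.lo) / span
```

With this scheme, a sensor whose resistance range slowly drifts upward still maps onto the same normalized [0, 1] interval, which is what keeps the downstream angle predictor's inputs stable.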
