Abstract

In-field human motion capture (HMC) is drawing increasing attention due to its multitude of application areas. Considerable research effort is currently invested in camera-based (markerless) HMC, which has the advantages of requiring no infrastructure on the body and of providing additional context information from the surroundings. However, camera-based approaches suffer from inherent drawbacks, namely a limited field of view and occlusions. In contrast, inertial HMC (IHMC) is unaffected by occlusions and is therefore a promising approach for capturing human motion outside the laboratory. One major challenge of such methods, however, is the need for spatial registration: typically, during a predefined calibration sequence, the orientation and location of each inertial sensor are registered with respect to the underlying skeleton model. This work contributes to calibration-free IHMC by proposing a recursive estimator for the simultaneous online estimation of all sensor poses and joint positions of a kinematic chain model such as the human skeleton. The full derivation from an optimization objective is provided. The approach can be applied directly to a synchronized data stream from a body-mounted inertial sensor network. Successful evaluations are demonstrated on noisy simulated data from a three-link chain, real lower-body walking data from 25 young, healthy persons, and walking data captured from a humanoid robot. The estimated and derived quantities, namely global and relative sensor orientations, joint positions, and segment lengths, can be exploited for human motion analysis and anthropometric measurements, as well as in the context of hybrid markerless visual-inertial HMC.
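To make the kinematic chain model mentioned above concrete, the following minimal sketch (not the paper's estimator) shows how joint positions of a three-link chain follow from segment lengths and absolute segment orientations via forward kinematics; all numeric values and the planar simplification are illustrative assumptions.

```python
import numpy as np

def joint_positions(segment_lengths, segment_angles):
    """Forward kinematics of a planar chain: accumulate joint positions
    from the chain root, given each segment's length and its absolute
    orientation (e.g. as would be derived from inertial orientation
    estimates). Returns an (n+1, 2) array of joint coordinates."""
    pos = np.zeros(2)
    joints = [pos.copy()]
    for length, angle in zip(segment_lengths, segment_angles):
        pos = pos + length * np.array([np.cos(angle), np.sin(angle)])
        joints.append(pos.copy())
    return np.array(joints)

# Hypothetical lower-limb-like segment lengths in metres and
# sagittal-plane orientations in radians (assumed example values).
lengths = [0.45, 0.40, 0.25]
angles = [np.pi / 2, np.pi / 2, 0.0]
print(joint_positions(lengths, angles))
```

In the full 3-D problem, the angles would be replaced by sensor orientations (e.g. quaternions) and the segment vectors by joint-position offsets in each sensor frame, which is precisely what a calibration-free estimator must identify online.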
