Abstract

Wearable inertial motion capture, a new type of motion capture technology, estimates human posture in 3-D space primarily through multisensor data fusion. Existing sensor-fusion methods typically rely on magnetometers to remove drift in the yaw-angle estimate, which limits their use in complex magnetic environments. In this article, an extended Kalman filter (EKF) data fusion method is proposed to fuse the nine-axis sensor data. Meanwhile, the heuristic drift reduction (HDR) method is used to correct the accumulated heading-angle error. In addition, the position in 3-D space is estimated by the foot-mounted zero-velocity-update (ZUPT) technique. Combining the 3-D attitude and position estimates, a biomechanical model of the human body is established to track real human motion. The EKF algorithm and position estimation methods are benchmarked against the gold-standard optical motion capture system in a range of indoor experiments. Various outdoor experiments are also conducted to verify the reliability of the proposed method. The results show that the proposed algorithm outperforms existing attitude estimation models in motion tracking and is feasible for 3-D human motion capture.
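
To illustrate the position-estimation step named above, the following is a minimal sketch of a foot-mounted ZUPT scheme, not the paper's exact formulation: stance phases are detected from the gyroscope norm (the threshold `GYRO_THRESH` and the navigation-frame input are assumptions made here for illustration), and the integrated velocity is reset to zero during each detected stance to suppress drift.

```python
# Hypothetical sketch of a foot-mounted zero-velocity-update (ZUPT) position
# estimator; the paper's actual pipeline additionally uses an EKF and HDR.
import numpy as np

GRAVITY = 9.81        # m/s^2, assumed gravity magnitude
GYRO_THRESH = 0.5     # rad/s, hypothetical stance-detection threshold

def zupt_position(acc_nav, gyro, dt):
    """Integrate navigation-frame acceleration to 3-D position, zeroing the
    velocity whenever the foot is detected as stationary.

    acc_nav : (N, 3) accelerations already rotated into the navigation frame
    gyro    : (N, 3) angular rates, used here only for stance detection
    dt      : sample period in seconds
    """
    n = len(acc_nav)
    vel = np.zeros((n, 3))
    pos = np.zeros((n, 3))
    for k in range(1, n):
        # Remove gravity and integrate acceleration to velocity.
        a = acc_nav[k] - np.array([0.0, 0.0, GRAVITY])
        vel[k] = vel[k - 1] + a * dt
        # Zero-velocity update: during stance the foot is (nearly) still,
        # so the drifting velocity estimate is reset to zero.
        if np.linalg.norm(gyro[k]) < GYRO_THRESH:
            vel[k] = 0.0
        pos[k] = pos[k - 1] + vel[k] * dt
    return pos
```

In practice the stance detector usually combines accelerometer and gyroscope statistics over a short window, and the velocity reset is applied as a pseudo-measurement inside the filter rather than as a hard reset; the hard reset above is kept only to keep the sketch short.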
