Abstract

Human motion capture is important for a wide variety of applications, e.g., biomechanical analysis, virtual reality, and character animation. Current human motion capture solutions require a large number of markers or sensors to be placed on the body. In this work, it is shown that this number can be reduced using data-driven approaches. First, lazy and eager learning methods are compared for estimating full-body movement from a minimal sensor set; both learning approaches lead to similar estimation accuracy. Next, the time coherency of the output poses of the previously developed eager learning method is improved using a stacked-input neural network. Results show that these deep and shallow learning approaches achieve comparable accuracy in estimating full-body poses from only five inertial sensors. The developed approach is then applied to running, where a network trained on a single subject estimates kinematics and kinetics more accurately (ρ > 0.99) than a network trained on multiple subjects (ρ > 0.9). Furthermore, an approach based on mechanical principles is applied to estimate the foot progression angle from a single foot-worn inertial sensor; the estimates agree closely with an optical reference (maximum mean error of 2.6°). Finally, different motion capture approaches are compared during running: inertial measurement units (processed with Xsens MVN Analyze) and optical markers (processed with Plug-In Gait and OpenSim Gait2392). The results show that the sagittal plane has excellent correlation (ρ > 0.96) and low RMSE (< 6°), whereas the transverse and frontal planes show weaker correlation. First steps towards reduced motion capture have been taken in this work; however, further improvements are required before these techniques can be applied in practice.
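
The lazy-versus-eager comparison in the first study can be illustrated with a minimal sketch: a k-nearest-neighbours regressor (a lazy learner, which defers computation to query time) and a multilayer perceptron (an eager learner, which fits a model up front) both learn a mapping from sparse sensor features to full-body pose parameters. The data shapes and hyperparameters below are illustrative assumptions, not the configuration used in the work.

```python
# Minimal sketch: lazy (KNN) vs. eager (MLP) regression from sparse
# sensor features to full-body pose parameters. Shapes and
# hyperparameters are illustrative assumptions only.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: features from 5 IMUs (~35 inputs) mapped to,
# e.g., 66 pose parameters (22 segments x 3 angles).
X = rng.standard_normal((2000, 35))   # sparse sensor features
Y = rng.standard_normal((2000, 66))   # full-body pose targets

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2,
                                          random_state=0)

# Lazy learner: stores the training set, interpolates at query time.
knn = KNeighborsRegressor(n_neighbors=5).fit(X_tr, Y_tr)

# Eager learner: fits an explicit parametric model during training.
mlp = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=500,
                   random_state=0).fit(X_tr, Y_tr)

for name, model in [("lazy/KNN", knn), ("eager/MLP", mlp)]:
    rmse = np.sqrt(np.mean((model.predict(X_te) - Y_te) ** 2))
    print(f"{name}: RMSE = {rmse:.3f}")
```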
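One way to read "stacked-input neural network" is a network whose input concatenates a short history of consecutive sensor frames, so that the estimated pose sequence varies smoothly over time. The sketch below shows only this input-stacking step; the window length and feature dimension are assumptions.

```python
# Sketch of input stacking: concatenate the current and previous sensor
# frames into one input vector so the network sees a short history,
# improving the time coherency of the output poses. Window length is
# an assumption.
import numpy as np

def stack_frames(frames, window=5):
    """frames: (T, D) sensor samples -> (T-window+1, window*D) inputs."""
    T, D = frames.shape
    return np.stack([frames[i:i + window].reshape(-1)
                     for i in range(T - window + 1)])

frames = np.random.default_rng(0).standard_normal((100, 35))
print(stack_frames(frames).shape)  # (96, 175)
```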
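The foot progression angle is the horizontal-plane angle between the long axis of the foot and the direction of progression. The sketch below illustrates only this geometric definition, given a foot-axis vector and a walking-direction vector already projected onto the ground plane; it does not reproduce the mechanics-based single-IMU estimator itself, and the sign convention is an assumption.

```python
# Geometry of the foot progression angle (FPA): the signed angle, in
# the horizontal plane, between the foot's long axis and the direction
# of progression. Illustrative sketch only; not the IMU pipeline.
import numpy as np

def foot_progression_angle(foot_axis_xy, progression_xy):
    """Signed FPA in degrees (sign convention assumed here)."""
    f = np.array(foot_axis_xy, dtype=float)
    p = np.array(progression_xy, dtype=float)
    f /= np.linalg.norm(f)
    p /= np.linalg.norm(p)
    # atan2 of the 2-D cross and dot products gives the signed angle.
    return np.degrees(np.arctan2(f[0] * p[1] - f[1] * p[0],
                                 np.dot(f, p)))

# Example: foot axis rotated 10 degrees from the walking direction.
walk_dir = np.array([1.0, 0.0])
foot_dir = np.array([np.cos(np.radians(10)), -np.sin(np.radians(10))])
print(foot_progression_angle(foot_dir, walk_dir))  # ~10 degrees
```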
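The agreement figures quoted above (ρ and RMSE) can be computed per joint angle as Pearson correlation and root-mean-square error between two synchronized time series, e.g., an IMU-based estimate against an optical reference. The signals below are synthetic placeholders.

```python
# Sketch of the agreement metrics: Pearson correlation (rho) and RMSE
# between two synchronized joint-angle traces. Signals are synthetic.
import numpy as np

def agreement(estimate, reference):
    est = np.asarray(estimate, dtype=float)
    ref = np.asarray(reference, dtype=float)
    rho = np.corrcoef(est, ref)[0, 1]           # Pearson correlation
    rmse = np.sqrt(np.mean((est - ref) ** 2))   # root-mean-square error
    return rho, rmse

t = np.linspace(0, 1, 200)                      # one gait cycle
ref = 40 * np.sin(2 * np.pi * t)                # e.g. knee flexion (deg)
est = ref + np.random.default_rng(0).normal(0, 3, t.size)  # noisy estimate

rho, rmse = agreement(est, ref)
print(f"rho = {rho:.3f}, RMSE = {rmse:.2f} deg")
```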
