Abstract
Optimization-based digital human modeling has gained significant momentum among the various approaches to human modeling. Any task can be formulated as an optimization problem, and the resulting model can predict not only postures but also motions. However, these optimization-based digital human models require validation against experimental data, and motion capture is one way to validate the predicted results. This paper summarizes the progress of motion capture experiment efforts at the Human-Centric Design Research (HCDR) Laboratory at Texas Tech University. An eight-camera motion capture system has been set up in the lab. Marker placement protocols have been developed in which markers are placed on subjects to highlight bony landmarks and to identify the segments between joints, in line with guidelines and suggestions previously established in the literature. A posture reconstruction algorithm has been developed to map joint angles from motion capture experiments to digital human models. Studies conducted in the lab include motion capture experiments on jumping; standing and seated reach; and pregnant women's walking, sit-to-stand, seated reach, and reach with external loads. The results show that the posture reconstruction algorithm accurately transfers motion capture experiment data to joint angles, and that the marker placement protocol reliably captures all joints. The main task of the motion capture system is to validate the optimization-based digital human models developed by other research members at the HCDR Lab.
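The abstract does not detail the posture reconstruction algorithm itself, but a minimal sketch of the underlying idea, computing a joint angle from three marker positions on adjacent segments, is shown below. The function name, marker coordinates, and the choice of shoulder/elbow/wrist markers are illustrative assumptions for this sketch, not the lab's actual protocol or algorithm.

    import numpy as np

    def joint_angle(prox, joint, dist):
        """Included angle (degrees) at `joint`, formed by markers on the
        proximal and distal segments, via the dot product of the two
        segment vectors."""
        u = np.asarray(prox, dtype=float) - np.asarray(joint, dtype=float)  # proximal segment vector
        v = np.asarray(dist, dtype=float) - np.asarray(joint, dtype=float)  # distal segment vector
        cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        # Clip guards against floating-point values slightly outside [-1, 1]
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    # Hypothetical marker positions (meters) for shoulder, elbow, and wrist
    shoulder = [0.10, 1.40, 0.00]
    elbow    = [0.12, 1.10, 0.05]
    wrist    = [0.30, 0.95, 0.10]
    print(f"Elbow included angle: {joint_angle(shoulder, elbow, wrist):.1f} deg")

In practice, a full reconstruction pipeline would apply this kind of computation frame by frame across all tracked joints, producing the joint-angle time histories that are then compared against the optimization-based model's predictions.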