Abstract

This paper presents a novel orientation estimation approach named inertial guided visual sample consensus (IGVSAC). The method is specifically designed for capturing the orientation of human body joints in free-living environments. Unlike traditional vision-based orientation estimation methods, which remove outliers among putative image-pair correspondences with hypothesize-and-verify models such as the computationally costly RANSAC, our approach exploits prior motion information (i.e., rotation and translation) deduced from the quick-response inertial measurement unit (IMU) as the initial body pose to help the camera remove hidden outliers. In addition, the IGVSAC algorithm maintains estimation accuracy even in the presence of a large proportion of outliers, thanks to its ability to reject apparent mismatches. The orientation estimated by the visual sensor, in turn, corrects long-term IMU drift. We conducted extensive experiments to verify the effectiveness and robustness of the IGVSAC algorithm. Comparisons with the highly accurate VICON and OptiTrack motion tracking systems show that our orientation estimation system is well suited to capturing human body joints.
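The abstract does not give implementation details, but the core idea it describes, using an IMU-derived rotation and translation prior to reject mismatched correspondences before any hypothesize-and-verify step, can be sketched with a standard epipolar-constraint pre-filter. The sketch below is illustrative only (function names, the Sampson-error test, and the threshold are assumptions, not the paper's actual method): correspondences that violate the essential-matrix constraint implied by the IMU pose prior are discarded as apparent mismatches.

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def prefilter_with_imu_prior(x1, x2, R_prior, t_prior, thresh=1e-4):
    """Reject putative correspondences that violate the epipolar constraint
    implied by an IMU-derived pose prior (illustrative, not the paper's code).

    x1, x2   : (N, 2) arrays of normalized image coordinates in views 1 and 2
    R_prior  : prior rotation such that X2 = R_prior @ X1 + t_prior
    t_prior  : prior translation (only its direction matters here)
    Returns a boolean mask of correspondences consistent with the prior.
    """
    # Essential matrix from the prior pose: E = [t]_x R
    E = skew(t_prior / np.linalg.norm(t_prior)) @ R_prior
    h1 = np.hstack([x1, np.ones((len(x1), 1))])   # homogeneous coords
    h2 = np.hstack([x2, np.ones((len(x2), 1))])
    Ex1 = h1 @ E.T        # rows are E @ x1_i  (epipolar lines in view 2)
    Etx2 = h2 @ E         # rows are E.T @ x2_i (epipolar lines in view 1)
    # First-order (Sampson) approximation of the geometric epipolar error
    num = np.sum(h2 * Ex1, axis=1) ** 2
    den = Ex1[:, 0]**2 + Ex1[:, 1]**2 + Etx2[:, 0]**2 + Etx2[:, 1]**2
    return num / den < thresh
```

In this sketch the surviving matches would then be passed to the consensus stage, which now faces far fewer outliers than raw RANSAC would, which is the computational saving the abstract alludes to.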
