Abstract

Cameras and inertial measurement units (IMUs) are widely used for motion tracking and general activity recognition. Sensor fusion techniques that combine vision- and IMU-based tracking rely on precise time synchronization and relative pose calibration between the sensors. In this work, we propose a novel technique for solving both the time and relative pose calibration between an optical target (OT) and an IMU. The optical tracking system provides 6DoF position and orientation data for the OT, from which the proposed approach simulates accelerometer and gyroscope readings and compares them against the real readings recorded by the IMU. Convergence to the desired relative pose calibration is achieved using an adaptive genetic algorithm.
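To illustrate the idea of simulating inertial readings from the optical trajectory and scoring a candidate relative pose against the recorded IMU data, here is a minimal sketch. It is not the paper's implementation: the function names, the scalar-last quaternion convention, the gravity vector, the finite-difference derivatives, and the simple sum-of-squares fitness are all illustrative assumptions, and the data are assumed to be already time-aligned and resampled to the optical rate.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (assumed convention)


def simulate_imu(positions, quats, dt, R_ot_imu, t_ot_imu):
    """Simulate gyroscope and accelerometer readings from the optical
    target's 6DoF trajectory for a candidate OT-to-IMU relative pose.

    positions : (N, 3) OT positions in the tracking (world) frame
    quats     : (N, 4) OT orientations as [x, y, z, w] quaternions
    dt        : sampling interval of the optical tracking system
    R_ot_imu  : (3, 3) candidate rotation taking OT-frame vectors to the IMU frame
    t_ot_imu  : (3,)   candidate lever arm from the OT origin to the IMU origin, in the OT frame
    """
    rots = R.from_quat(quats)

    # Angular velocity in the OT frame from consecutive orientations:
    # q_k^{-1} * q_{k+1} is the incremental rotation over one sample.
    omega_ot = (rots[:-1].inv() * rots[1:]).as_rotvec() / dt      # (N-1, 3)
    gyro_sim = omega_ot @ R_ot_imu.T                              # expressed in the IMU frame

    # IMU position = OT position plus the lever arm rotated into the world frame.
    p_imu = positions + rots.apply(t_ot_imu)

    # Linear acceleration by central differences; the accelerometer measures
    # specific force (acceleration minus gravity) expressed in the IMU frame.
    a_world = (p_imu[2:] - 2.0 * p_imu[1:-1] + p_imu[:-2]) / dt ** 2
    R_world_to_imu = R_ot_imu @ rots[1:-1].inv().as_matrix()      # (N-2, 3, 3)
    accel_sim = np.einsum('nij,nj->ni', R_world_to_imu, a_world - GRAVITY)

    return gyro_sim, accel_sim


def fitness(candidate, positions, quats, dt, gyro_real, accel_real):
    """Hypothetical fitness for a genetic-algorithm individual: the candidate
    vector encodes the relative rotation (as a rotation vector) and the lever
    arm, and is scored by how closely the simulated readings match the real ones."""
    R_ot_imu = R.from_rotvec(candidate[:3]).as_matrix()
    t_ot_imu = candidate[3:6]
    gyro_sim, accel_sim = simulate_imu(positions, quats, dt, R_ot_imu, t_ot_imu)
    n = accel_sim.shape[0]
    return (np.sum((gyro_sim[:n] - gyro_real[:n]) ** 2)
            + np.sum((accel_sim - accel_real[:n]) ** 2))
```

An adaptive genetic algorithm would then evolve a population of such candidate relative poses (and, in the full problem, a time offset) toward the minimum of this fitness function.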
