Abstract

This paper proposes an ego-motion tracking method that uses visual-inertial sensors for wearable blind navigation. The unique challenge of wearable motion tracking is coping with arbitrary body motion and complex environmental dynamics. We introduce a visual sanity check that selects accurate visual estimates by comparing the visually estimated rotation with the rotation measured by a gyroscope. The movement trajectory is recovered through adaptive fusion of visual estimates and inertial measurements, where the visual estimation outputs the motion transformation between consecutive image captures and the inertial sensors measure translational acceleration and angular velocity. The frame rates of the visual and inertial sensors differ and vary over time owing to the visual sanity check. We therefore employ a multirate extended Kalman filter (EKF) to fuse the visual and inertial estimates. The proposed method was tested in different indoor environments, and the results demonstrate its effectiveness and accuracy in ego-motion tracking.
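As a rough illustration of the sanity-check idea described above (a minimal sketch, not the authors' implementation): integrate the gyroscope's angular velocities over the interval between two image captures, then accept the visual rotation estimate only if the relative rotation angle between the two estimates falls below a threshold. The function name, the acceptance threshold, and the gyro sampling convention below are assumptions chosen for illustration.

```python
import numpy as np

def visual_sanity_check(R_visual, gyro_rates, dt, tol_rad=0.05):
    """Accept a visual rotation estimate only if it agrees with the
    rotation integrated from gyroscope samples over the same interval.

    R_visual   : 3x3 rotation matrix estimated visually between two
                 consecutive image captures.
    gyro_rates : (N, 3) array of angular velocities (rad/s) sampled
                 between the two captures.
    dt         : gyroscope sampling period (s).
    tol_rad    : acceptance threshold (assumed value) on the relative
                 rotation angle, in radians.
    """
    # Integrate angular velocities into a rotation matrix using
    # incremental axis-angle (Rodrigues) updates.
    R_gyro = np.eye(3)
    for w in gyro_rates:
        theta = np.linalg.norm(w) * dt
        if theta > 0.0:
            axis = w / np.linalg.norm(w)
            K = np.array([[0.0, -axis[2], axis[1]],
                          [axis[2], 0.0, -axis[0]],
                          [-axis[1], axis[0], 0.0]])
            R_inc = (np.eye(3) + np.sin(theta) * K
                     + (1.0 - np.cos(theta)) * (K @ K))
            R_gyro = R_gyro @ R_inc
    # Angle of the relative rotation between visual and gyro estimates;
    # trace formula: angle = arccos((tr(R) - 1) / 2).
    R_rel = R_visual.T @ R_gyro
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_angle) < tol_rad
```

When the check fails, the visual estimate would be discarded and the EKF would proceed on inertial measurements alone until the next accepted visual update, which is what makes the filter's visual update rate time-varying.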
