Abstract

In this paper, we propose a personal positioning method for a wearable augmented reality (AR) system that allows a user to move freely both indoors and outdoors. The user is equipped with self-contained sensors, a wearable camera, an inertial head tracker, and a display. The method fuses, within a Kalman filtering framework, estimates of relative displacement caused by human walking locomotion with estimates of absolute position and orientation. The former is based on intensive analysis of human walking behavior using the self-contained sensors. The latter is based on matching video frames from the wearable camera against an image database prepared beforehand.
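To make the fusion scheme concrete, the sketch below shows one simple way such relative and absolute estimates could be combined in a linear Kalman filter. It is not the authors' implementation: the state is reduced to a 2-D position (the paper's filter also handles orientation), the image-matching fix is treated as a direct position measurement with an identity measurement model, and all noise covariances and step values are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's filter): fuse per-step relative
# displacement from walking locomotion with occasional absolute position
# fixes from image matching, using a linear Kalman filter over [x, y].
import numpy as np

class PositionFusionKF:
    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)   # position estimate [x, y]
        self.P = np.asarray(P0, dtype=float)   # estimate covariance (2x2)

    def predict(self, step_displacement, Q):
        # Relative update: add the displacement estimated for one walking
        # step; Q models uncertainty of the step-length/heading estimate.
        self.x = self.x + np.asarray(step_displacement, dtype=float)
        self.P = self.P + np.asarray(Q, dtype=float)

    def correct(self, abs_position, R):
        # Absolute update: position obtained by matching the current video
        # frame against the pre-registered image database (H = I assumed).
        z = np.asarray(abs_position, dtype=float)
        S = self.P + np.asarray(R, dtype=float)   # innovation covariance
        K = self.P @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ (z - self.x)        # corrected position
        self.P = (np.eye(2) - K) @ self.P         # corrected covariance

# Example: dead-reckoning drift accumulates over 20 steps, then an
# image-based absolute fix pulls the estimate back.
kf = PositionFusionKF(x0=[0.0, 0.0], P0=np.eye(2) * 0.01)
for _ in range(20):
    kf.predict(step_displacement=[0.7, 0.0], Q=np.eye(2) * 0.02)
kf.correct(abs_position=[13.5, 0.3], R=np.eye(2) * 0.25)
print(kf.x, np.diag(kf.P))
```

In this setup the prediction step carries the drift of walking-based dead reckoning, and each successful image match bounds that drift by shrinking the covariance toward the measurement noise.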
