Abstract
Indoor positioning has been intensively studied recently due to the exploding demand from indoor mobile applications. While numerous works have employed wireless signals or dead-reckoning techniques, wearable computing poses new opportunities as well as challenges for the localization problem. This research studies the wearable localization problem by proposing a particle filter-based scheme to fuse the inputs from wearable inertial and visual sensors on the human body. Specifically, the filter takes inertial measurements, wireless signals, visual landmarks, and indoor floor plans as inputs for location tracking. The inertial signals imply human body movements, the wireless signals indicate a rough absolute region inside a building, and the visual landmarks provide the relative angles from particular positions to these markers. Furthermore, a head-mounted display provides an intuitive and friendly interface to users. The proposed system has also been prototyped and tested on our campus, and the experiments demonstrate an average localization error of about one meter.
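The fusion scheme the abstract describes can be illustrated with a minimal 2-D particle filter sketch. This is not the paper's actual implementation: the function name, the noise parameters, and the single-landmark bearing observation are all simplifying assumptions made here for illustration. Each cycle propagates position hypotheses with a dead-reckoned inertial step (predict), weights them by agreement with an observed bearing to a known visual landmark (update), and resamples.

```python
import math
import random

def particle_filter_step(particles, step_len, heading, landmark, bearing_obs,
                         motion_noise=0.1, bearing_noise=0.2):
    """One predict-update-resample cycle (hypothetical sketch, not the paper's code).

    particles:   list of (x, y) position hypotheses
    step_len,
    heading:     dead-reckoned step from inertial sensing (meters, radians)
    landmark:    known (x, y) of a visual marker from the floor plan
    bearing_obs: observed angle from the user to the landmark (radians)
    """
    # Predict: propagate each particle with the inertial step plus noise.
    moved = [(x + (step_len + random.gauss(0, motion_noise)) * math.cos(heading),
              y + (step_len + random.gauss(0, motion_noise)) * math.sin(heading))
             for x, y in particles]

    # Update: weight each particle by how well the bearing it implies
    # matches the observed bearing (Gaussian likelihood on angle error).
    weights = []
    for x, y in moved:
        expected = math.atan2(landmark[1] - y, landmark[0] - x)
        err = math.atan2(math.sin(bearing_obs - expected),
                         math.cos(bearing_obs - expected))  # wrap to [-pi, pi]
        weights.append(math.exp(-err ** 2 / (2 * bearing_noise ** 2)))

    # Resample: draw particles proportionally to their weights.
    return random.choices(moved, weights=weights, k=len(moved))

# Usage: with no movement, repeated bearing observations of a landmark at
# (5, 5) concentrate the particles on positions consistent with that angle.
random.seed(0)
pts = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(500)]
for _ in range(20):
    pts = particle_filter_step(pts, step_len=0.0, heading=0.0,
                               landmark=(5.0, 5.0), bearing_obs=math.pi / 4)
```

A wireless-signal input would enter the same loop naturally: instead of a bearing likelihood, it would weight particles by whether they fall in the coarse region indicated by the signal, and the floor plan would zero out particles that cross walls.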