Abstract

Indoor navigation is a representative application of indoor positioning systems, which use a variety of equipment, including smartphones equipped with various sensors. Many indoor navigation systems combine Wi-Fi signals with inertial sensors, such as a 3D accelerometer, digital compass, gyroscope, and barometer, to improve the accuracy of user location tracking. These inertial sensors are vulnerable to changes in the surrounding environment and sensitive to users' behavior, but little research has been conducted on sensor fusion under these conditions. In this paper, we propose a dynamic sensor fusion framework (DSFF) that provides accurate user tracking by dynamically calibrating inertial sensor readings during the sensor fusion process. The proposed method continually learns the errors and biases of each sensor that arise from changes in user behavior patterns and the surrounding environment. The learned patterns are then dynamically applied to the user tracking process to yield accurate results. The results of experiments conducted in both a single-story and a multi-story building confirm that DSFF provides accurate tracking results. The scalability of the DSFF will enable it to provide even more accurate tracking with various sensors, both existing and under development.

Highlights

  • The recent evolution of the various sensors installed in smartphones has enabled hybrid indoor positioning that integrates various signals and sensor readings for more accurate user tracking

  • We propose a dynamic sensor fusion framework that can dynamically learn the pattern of errors, bias, and reliability of each sensor at run-time to adapt to a user’s tracking environment and individual movement pattern

  • The dynamic sensor fusion (DSF) may fail if a sensor outlier continues to appear as an extreme error that deviates from the learned pattern

Introduction

The recent evolution of the various sensors installed in smartphones has enabled hybrid indoor positioning that integrates various signals and sensor readings for more accurate user tracking. Sensor data contain errors and biases caused by the dynamically changing environment, the self-contained sensors themselves, and other factors, which complicates determining how much weight each sensor should receive. If the sensor error patterns affected by this dynamic environment were properly interpreted and the reliability of each sensor revealed, a more accurate tracking result could be realized by a more sophisticated fusion of multiple sensors. Many researchers have already proposed user-tracking methods that focus on sensor fusion [1], [2], [3], [4], [5], [6], [7], in which the sensors independently determine the user's position.
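The idea of weighting sensors by their learned reliability can be illustrated with a minimal sketch. The code below is a hypothetical example, not the paper's actual DSFF implementation: it fuses two 1-D position estimates (say, a Wi-Fi fix and an inertial dead-reckoning estimate) by inverse-variance weighting, where each sensor's variance is re-learned at run time from its recent residuals against the previous fused fix. All names and parameters are illustrative assumptions.

```python
# Hypothetical sketch of reliability-weighted sensor fusion.
# Each sensor's error variance is estimated online from its residuals
# against the previous fused position (a proxy for a slow-moving user),
# so the fusion weights adapt as error patterns change.
from collections import deque

class AdaptiveFuser:
    def __init__(self, window=20):
        # Recent squared residuals per sensor; their mean is a variance proxy.
        self.residuals = {"wifi": deque(maxlen=window),
                          "imu": deque(maxlen=window)}
        self.prev = None  # last fused position

    def _variance(self, name):
        r = self.residuals[name]
        return sum(r) / len(r) if r else 1.0  # neutral default before learning

    def fuse(self, wifi_pos, imu_pos):
        """Combine two 1-D position estimates by inverse-variance weighting."""
        vw, vi = self._variance("wifi"), self._variance("imu")
        w_wifi = (1 / vw) / (1 / vw + 1 / vi)
        fused = w_wifi * wifi_pos + (1 - w_wifi) * imu_pos
        # Learn each sensor's error pattern from its residual to the last fix.
        if self.prev is not None:
            self.residuals["wifi"].append((wifi_pos - self.prev) ** 2)
            self.residuals["imu"].append((imu_pos - self.prev) ** 2)
        self.prev = fused
        return fused

fuser = AdaptiveFuser()
# Noisy Wi-Fi fixes around 5 m; steadier inertial estimates drifting slightly.
for wifi, imu in [(5.8, 5.1), (4.1, 5.2), (6.0, 5.2), (4.3, 5.3)]:
    pos = fuser.fuse(wifi, imu)
```

After a few steps, the noisier Wi-Fi stream accumulates larger residuals, so its weight shrinks and the fused track follows the steadier inertial estimate. A full system would extend this to 2-D/3-D positions and per-sensor bias terms, but the core idea of run-time reliability learning is the same.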
