Abstract
Crowd-sensing-based localization is regarded as an effective method for providing indoor location-based services in large-scale urban areas. The performance of the crowd-sensing approach is limited by the poor accuracy of collected daily-life trajectories and by the difficulty of efficiently combining different location sources with indoor maps. This paper proposes a robust map-assisted 3D indoor localization framework using crowd-sensing-based trajectory data and error ellipse-enhanced fusion (ML-CTEF). In the off-line phase, a novel inertial odometry method, which combines a 1D convolutional neural network (1D-CNN) with a bidirectional long short-term memory (Bi-LSTM)-based walking speed estimator, is proposed for accurate pre-processing of crowd-sensed trajectory data under different handheld modes. The Bi-LSTM network is further applied to floor identification, and an indoor-network matching algorithm is adopted to generate the fingerprinting database without additional manual effort. In the online phase, an error ellipse-assisted particle filter is proposed for the intelligent integration of inertial odometry, crowdsourced Wi-Fi fingerprinting, and indoor map information. The experimental results show that the proposed ML-CTEF achieves autonomous and precise 3D indoor localization in complex, large-scale indoor environments; the estimated average positioning error is within 1.01 m in a multi-floor indoor building.
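For illustration only, the sketch below shows one plausible form of the online fusion step described above: particles are propagated by inertial-odometry steps, re-weighted by a Gaussian likelihood over the Wi-Fi fingerprint error ellipse, and constrained by the indoor map. This is not the authors' implementation; names such as `wifi_cov` (a 2x2 covariance representing the error ellipse) and `crosses_wall` (the map constraint) are assumptions.

```python
# Illustrative sketch (not the paper's code): error-ellipse-assisted
# particle filter fusing inertial odometry, a Wi-Fi fingerprint fix,
# and an indoor-map wall constraint.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, step_len, heading, sigma_len=0.1, sigma_head=0.05):
    """Propagate particles with one inertial-odometry step (length + heading)."""
    n = len(particles)
    L = step_len + sigma_len * rng.standard_normal(n)
    H = heading + sigma_head * rng.standard_normal(n)
    particles[:, 0] += L * np.cos(H)
    particles[:, 1] += L * np.sin(H)
    return particles

def update(particles, weights, wifi_xy, wifi_cov, prev_particles, crosses_wall):
    """Weight particles by the Wi-Fi error ellipse and the indoor map."""
    inv_cov = np.linalg.inv(wifi_cov)
    d = particles - wifi_xy                       # offset from fingerprint fix
    m2 = np.einsum('ni,ij,nj->n', d, inv_cov, d)  # squared Mahalanobis distance
    weights = weights * np.exp(-0.5 * m2)         # Gaussian likelihood over the ellipse
    # Map constraint: discard particles whose step crosses a wall segment
    # (crosses_wall is a hypothetical helper backed by the indoor map).
    blocked = np.array([crosses_wall(p0, p1)
                        for p0, p1 in zip(prev_particles, particles)])
    weights[blocked] = 0.0
    weights += 1e-300                             # avoid an all-zero weight vector
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling when the effective sample size is low."""
    n = len(weights)
    if 1.0 / np.sum(weights**2) < n / 2:
        positions = (np.arange(n) + rng.random()) / n
        idx = np.searchsorted(np.cumsum(weights), positions)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```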