Abstract

Smartphones are widely used for navigation- and localization-based services, and scenario recognition is critical for seamless indoor and outdoor navigation. Recognizing different scenarios from smartphone sensing data is a meaningful but challenging problem. To address this issue, we propose a structured grid-based deep-learning scenario recognition technique that uses smartphone global navigation satellite system (GNSS) measurements (satellite position, pseudorange, Doppler shift, and C/N0). In this work, the scenarios are grouped into four categories: deep indoors, shallow indoors, semioutdoors, and open outdoors. The proposed approach uses Voronoi tessellations to obtain structured-grid representations from satellite positions and processes them with convolutional neural networks (CNNs) and convolutional long short-term memory (ConvLSTM) networks. Considering only spatial information, the CNN model extracts features from the Voronoi tessellations for scenario recognition, achieving a high accuracy of 98.82%. Then, to enhance the robustness of the algorithm, the ConvLSTM network is adopted, which treats the measurements as spatiotemporal sequences, improving the accuracy to 99.92%. Compared with existing methods, the proposed algorithm is simple and efficient, using only GNSS measurements without the need for additional sensors. Furthermore, the latencies of the CNN and ConvLSTM models on a CPU are only 16.82 and 27.94 ms, respectively. Therefore, the proposed algorithm has the potential for real-time applications.
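
For illustration only (not the authors' implementation), the sketch below shows one way the structured-grid idea could be realized: satellite azimuths and elevations are projected onto a sky plot, each grid cell takes the C/N0 value of its nearest satellite (a discrete Voronoi tessellation), and a small CNN classifies the resulting image into the four scenario classes. The grid size, the single C/N0 channel, and the network layout are assumptions made here for brevity; the paper's model also uses satellite position, pseudorange, and Doppler measurements, and its ConvLSTM variant additionally stacks such grids over time.

```python
# Minimal sketch (assumptions noted above): Voronoi-style sky-plot grid from
# GNSS observables plus a tiny CNN classifier for the four scenario classes.
import numpy as np
import torch
import torch.nn as nn

def voronoi_sky_grid(az_deg, el_deg, cn0, size=32):
    """Assign each grid cell the C/N0 of its nearest satellite (discrete Voronoi)."""
    # Project satellites onto a 2-D sky plot: radius grows as elevation drops.
    r = (90.0 - np.asarray(el_deg)) / 90.0
    sx = r * np.sin(np.radians(az_deg))
    sy = r * np.cos(np.radians(az_deg))
    xs = np.linspace(-1.0, 1.0, size)
    gx, gy = np.meshgrid(xs, xs)                       # grid cell centres
    d2 = (gx[..., None] - sx) ** 2 + (gy[..., None] - sy) ** 2
    nearest = d2.argmin(axis=-1)                       # Voronoi cell index per pixel
    grid = np.asarray(cn0, dtype=np.float32)[nearest]
    grid[np.sqrt(gx**2 + gy**2) > 1.0] = 0.0           # blank pixels outside the sky disc
    return grid

class ScenarioCNN(nn.Module):
    """Small CNN mapping a single-channel sky grid to four scenario scores."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.net(x)

if __name__ == "__main__":
    # Synthetic epoch with 8 visible satellites (random azimuth, elevation, C/N0).
    az = np.random.uniform(0, 360, 8)
    el = np.random.uniform(5, 90, 8)
    cn0 = np.random.uniform(20, 50, 8)
    grid = voronoi_sky_grid(az, el, cn0)
    x = torch.from_numpy(grid)[None, None]             # shape (1, 1, 32, 32)
    logits = ScenarioCNN()(x)
    print("scenario scores:", logits.detach().numpy())
```

In practice such a model would be trained on labeled epochs from the four scenario classes; the ConvLSTM extension described in the abstract would consume a short temporal window of these grids rather than a single frame.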
