Abstract

The lack of indoor floor plans is one of the major obstacles to ubiquitous indoor location-based services. Dedicated mobile robots with high-precision sensors can measure and produce accurate indoor maps, but their deployment among the general public remains limited. Some existing smartphone-based methods adopt computer vision techniques to build 3D point clouds, at the cost of considerable image-collection effort and potential privacy risks. In this paper, we propose BatMapper-Plus, which adopts acoustic ranging and inertial tracking to construct precise and complete indoor floor plans on smartphones. It emits acoustic signals to measure the distance from the smartphone to a neighbouring wall segment, and derives accessible areas as the user walks around the building. It also refines the constructed floor plan to eliminate scattered segments, and identifies connection areas such as stairs and elevators between different floors. In addition, we propose an LSTM-based dead-reckoning model trained on outdoor IMU readings and GPS records, and use it to infer step length during indoor walking, thereby improving floor plan quality. We also describe how to use the constructed map for indoor navigation: a Dynamic Time Warping algorithm automatically matches current inertial readings against the historical sensory data recorded during map construction to produce fine-grained walking guidance. To demonstrate our effectiveness against the state-of-the-art, we carry out extensive experiments in a teaching building and a residential building. The results show that our method is efficient and free of privacy concerns as well as texture and illumination limitations.
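To make the navigation step concrete, the sketch below illustrates a generic Dynamic Time Warping alignment of the kind the abstract refers to: matching a user's current inertial feature sequence against traces stored during map construction. It is a minimal illustration under assumed inputs (per-step heading values, a `stored_traces` dictionary, and the function name `dtw_distance` are all hypothetical), not the paper's exact formulation.

```python
import numpy as np

def dtw_distance(query, reference):
    """Classic dynamic-programming DTW between two 1-D feature sequences.

    query:     current inertial feature sequence (e.g., per-step headings)
    reference: a sensory trace recorded during map construction
    Returns the accumulated alignment cost; a lower cost means the current
    trajectory matches this stored path segment more closely.
    """
    n, m = len(query), len(reference)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(query[i - 1] - reference[j - 1])      # local distance
            cost[i, j] = d + min(cost[i - 1, j],           # insertion
                                 cost[i, j - 1],           # deletion
                                 cost[i - 1, j - 1])       # match
    return cost[n, m]

# Hypothetical usage: pick the stored trace that best explains the user's
# latest inertial readings, then issue walking guidance along that path.
if __name__ == "__main__":
    current_headings = np.array([0.1, 0.2, 1.5, 1.6, 1.4])   # radians, illustrative
    stored_traces = {
        "corridor_A": np.array([0.0, 0.1, 1.6, 1.5]),
        "corridor_B": np.array([3.1, 3.0, 2.9, 3.1]),
    }
    best = min(stored_traces, key=lambda k: dtw_distance(current_headings, stored_traces[k]))
    print("Best-matching stored trace:", best)
```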
