Abstract
Indoor localization is of great importance in the era of mobile computing. Smartphone-based pedestrian tracking is essential to a wide range of applications in shopping malls, industrial facilities, office buildings, and other public places. Current mainstream solutions rely on radio fingerprints and/or inertial sensors to distinguish and track pedestrians. However, these methods suffer from considerable deployment effort and large accumulative errors. In recent years, the growing number of security surveillance cameras installed in public areas has provided a fresh perspective for overcoming these drawbacks. However, in real dynamic environments, fusing camera-based and inertial sensor-based pedestrian tracking is non-trivial due to the low robustness of visual tracking, the lack of identity correspondence between the two modalities, and the high computational complexity. This paper presents the design and implementation of iPAC, an integrated inertial sensor and camera-based indoor localization and tracking system that achieves high accuracy in dynamic indoor environments with zero human effort. iPAC uses a robust visual detection and tracking algorithm to differentiate and track pedestrians in dynamic environments. Furthermore, iPAC employs a motion sequence-based matching algorithm to fuse the raw estimates from both sub-systems. By doing so, iPAC delivers enhanced accuracy while overcoming the respective drawbacks of each sub-system. We implement iPAC on commodity smartphones and validate its performance in complex environments (including a laboratory, a classroom building, and an office building). The results show that iPAC achieves a remarkable detection success rate of 93% and a tracking success rate of 95% even under severe line-of-sight blockages.
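The abstract mentions a motion sequence-based matching algorithm that associates anonymous camera tracks with smartphone PDR streams. The paper's exact formulation is not given here, so the following is only a minimal sketch of the general idea, assuming per-interval speed sequences are available from both sources; all names (`pdr_speeds`, `camera_speeds`, the greedy assignment) are illustrative assumptions rather than the authors' algorithm.

```python
# Hedged sketch: match camera tracks to PDR users by comparing motion
# (speed) sequences over a recent time window. Illustrative only.
import numpy as np

def sequence_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Average Euclidean distance between two speed sequences (truncated to equal length)."""
    n = min(len(a), len(b))
    return float(np.linalg.norm(a[:n] - b[:n]) / max(n, 1))

def match_identities(pdr_speeds: dict, camera_speeds: dict) -> dict:
    """Greedily assign each camera track to the unmatched PDR user whose
    recent speed sequence is most similar (smaller distance = better)."""
    matches, taken = {}, set()
    for track_id, cam_seq in camera_speeds.items():
        best_user, best_dist = None, float("inf")
        for user_id, pdr_seq in pdr_speeds.items():
            if user_id in taken:
                continue
            d = sequence_distance(np.asarray(cam_seq), np.asarray(pdr_seq))
            if d < best_dist:
                best_user, best_dist = user_id, d
        if best_user is not None:
            matches[track_id] = best_user
            taken.add(best_user)
    return matches

# Example: two camera tracks matched against two smartphone PDR streams.
pdr = {"phone_A": [1.1, 1.2, 0.0, 1.0], "phone_B": [0.0, 0.0, 1.3, 1.2]}
cam = {"track_1": [0.0, 0.1, 1.2, 1.1], "track_2": [1.0, 1.1, 0.1, 0.9]}
print(match_identities(pdr, cam))  # {'track_1': 'phone_B', 'track_2': 'phone_A'}
```

A greedy one-to-one assignment is the simplest choice; a real system would more likely solve a global assignment (e.g., Hungarian algorithm) and use a richer sequence distance such as DTW.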
Highlights
Accurate indoor localization and tracking is the core technology that enables a wide variety of indoor applications, such as customer navigation, augmented reality, and intelligent advertisements
The past decades have witnessed the fast development of numerous indoor localization and tracking techniques, including wireless signals [1]–[4], cameras [5]–[7], inertial measurement units (IMUs) [8]–[11], etc
To tackle the above challenges, in this paper, we propose iPAC, an integrated Pedestrian Dead Reckoning (PDR) and computer vision (CV) indoor localization and tracking system with zero human effort
Summary
Accurate indoor localization and tracking is the core technology that enables a wide variety of indoor applications, such as customer navigation, augmented reality, and intelligent advertisements. Current deep learning-based pedestrian detection and tracking algorithms have high computational complexity, which makes it difficult to meet real-time requirements even with good hardware support. For vision-based indoor localization and tracking, the traditional approach of resorting to machine learning is therefore a bottleneck that restricts the system's real-time performance. We observe that the background of the environment monitored by surveillance cameras is almost unchanged. Based on this observation, we design a real-time pedestrian detection algorithm that adopts a dynamic background difference method. Although a real-time background image can be obtained in this way, the computational complexity would be extremely high if every frame were added to the queue used to calculate the background image. The tracking algorithm can overcome blocking interference over short periods, and users are still correctly re-identified, rather than being assigned a new identity, when they re-enter the monitoring range after leaving it for a while
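To make the dynamic background difference idea concrete, here is a minimal sketch of one possible realization: a bounded queue of periodically sampled frames provides a median background image, and pedestrians are detected by thresholding the difference between the current frame and that background. The queue length, sampling interval, thresholds, and minimum blob area are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch of dynamic background difference detection.
from collections import deque
import cv2
import numpy as np

QUEUE_LEN = 30        # frames kept for background estimation (assumed)
SAMPLE_INTERVAL = 15  # only every 15th frame enters the queue (assumed)
DIFF_THRESHOLD = 30   # grayscale difference threshold (assumed)
MIN_AREA = 800        # minimum contour area for a pedestrian blob (assumed)

frame_queue = deque(maxlen=QUEUE_LEN)

def detect_pedestrians(frame_bgr, frame_index):
    """Return bounding boxes of moving pedestrians in one frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Update the dynamic background only on sampled frames to bound cost,
    # since enqueuing every frame would be too expensive.
    if frame_index % SAMPLE_INTERVAL == 0:
        frame_queue.append(gray)
    if len(frame_queue) < 2:
        return []

    # Median over the queue approximates the (almost static) background.
    background = np.median(np.stack(frame_queue), axis=0).astype(np.uint8)

    # Background difference -> binary mask -> cleaned-up blobs.
    diff = cv2.absdiff(gray, background)
    _, mask = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= MIN_AREA]
```

Because the background is refreshed from a sliding window of sampled frames, slow scene changes (lighting, moved furniture) are gradually absorbed into the background while walking pedestrians remain in the difference mask.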