Using one’s own smartphone for indoor navigation can be a valuable and empowering tool, particularly for visually impaired individuals, promoting independence and safe mobility. Pedestrian Dead Reckoning (PDR), which relies on smartphone sensors such as accelerometers and gyroscopes, is a well-established and widely used approach to indoor navigation; however, despite continuous improvements in sensor technology, accumulated drift remains a principal obstacle to the effective implementation of PDR systems. Simultaneous Localization and Mapping (SLAM) on smartphones has likewise gained significant adoption for indoor navigation in recent years, but it faces its own challenges, including low localization accuracy and potential tracking divergence. To overcome these limitations, this paper presents a novel and promising solution for smartphone-based indoor localization that addresses the weaknesses of both PDR and SLAM through their synergistic integration. Specifically, our approach fuses camera-based ORB-SLAM with PDR inertial sensing at the back end via a Kalman filter, correcting the accumulated errors of the localization system and thereby improving accuracy. Experimental results show that our localization system reduces the mean localization error by 62%, a considerable improvement over the standalone PDR and SLAM systems. Moreover, compared with state-of-the-art techniques, our system achieves high localization accuracy, with an improvement of 59%, making it well suited to today’s real-time applications and especially beneficial for visually impaired people.
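To make the back-end fusion concrete, the sketch below shows one way such a Kalman-filter coupling can be structured: PDR step-and-heading estimates drive the prediction step, while ORB-SLAM position fixes act as measurement updates. This is a minimal illustrative sketch under assumed conditions, not the implementation described in the paper; the class name, the 2-D position-only state, and the noise parameters q_pdr and r_slam are assumptions introduced purely for illustration.

```python
# Illustrative sketch (not the paper's implementation): a 2-D position Kalman
# filter where PDR step-and-heading estimates drive the prediction step and
# ORB-SLAM position fixes serve as measurement updates at the back end.
# Noise parameters (q_pdr, r_slam) are assumed values for illustration only.
import numpy as np


class PdrSlamKalmanFusion:
    def __init__(self, x0, q_pdr=0.05, r_slam=0.10):
        self.x = np.asarray(x0, dtype=float)      # state: [x, y] position (m)
        self.P = np.eye(2)                        # state covariance
        self.Q = np.eye(2) * q_pdr**2             # PDR process noise (drift)
        self.R = np.eye(2) * r_slam**2            # SLAM measurement noise
        self.H = np.eye(2)                        # SLAM observes position directly

    def predict_with_pdr(self, step_length, heading):
        """Propagate the position with one PDR step (length in m, heading in rad)."""
        dx = step_length * np.array([np.cos(heading), np.sin(heading)])
        self.x = self.x + dx
        self.P = self.P + self.Q                  # uncertainty grows without corrections

    def update_with_slam(self, z_slam):
        """Correct the PDR-propagated state with an ORB-SLAM position fix."""
        z = np.asarray(z_slam, dtype=float)
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R   # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x


# Example: one PDR step followed by a SLAM correction.
fusion = PdrSlamKalmanFusion(x0=[0.0, 0.0])
fusion.predict_with_pdr(step_length=0.7, heading=np.pi / 4)
fused_position = fusion.update_with_slam([0.52, 0.48])
print(fused_position)
```

In this kind of loose coupling, the PDR stream keeps the trajectory continuous between camera fixes, while each SLAM observation pulls the estimate back and bounds the drift; how the noise covariances are tuned governs how strongly each source is trusted.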