Abstract

To overcome the limitations of the Global Positioning System (GPS) in indoor environments, various indoor positioning systems have been developed using Wi-Fi, Bluetooth, Ultra-wideband (UWB), and radio-frequency identification (RFID). Among them, Wi-Fi is the technology most commonly used for indoor navigation. However, Wi-Fi signals may be unavailable in some areas because of obstacles and coverage gaps, and even where coverage exists, the accuracy achieved by Wi-Fi is typically 5–15 m, which is inadequate for visually impaired people. The growing availability of Bluetooth beacons for positioning and of smartphones with built-in inertial sensors makes it possible to develop practical indoor navigation systems. This paper presents an indoor navigation framework for visually impaired persons (VIPs) based on smartphone inertial sensors and Bluetooth beacons. Beacons or proximity sensors installed in a building can guide a pedestrian between two landmarks or points of interest through turn-by-turn navigation. However, external sensing is absent in certain areas of a building, such as large hallways or dark corridors. The proposed model demonstrates that inertial sensors can track a VIP in such areas and also minimizes the use of external sensors between two landmarks or beacons. The performance of the proposed framework, with the fusion algorithm implemented in an Android application, is evaluated through trajectory tests on a smartphone. Experimental results from the walking traces show that the system achieves a mean position error of approximately 1.5–2 m, which could be improved further by applying magnetometer-based position learning techniques.
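The abstract does not spell out the fusion algorithm, but the described behavior (inertial tracking between landmarks, correction when a beacon is encountered) can be illustrated with a minimal beacon-anchored pedestrian dead reckoning (PDR) sketch. The class and parameter names below (Beacon, PdrTracker, the 0.7 m step length, the 2 m snap radius) are illustrative assumptions, not the paper's actual implementation or API.

```kotlin
// Hypothetical sketch of beacon-anchored pedestrian dead reckoning (PDR).
// Not the paper's implementation; names and constants are assumptions.

import kotlin.math.cos
import kotlin.math.sin

data class Beacon(val id: String, val x: Double, val y: Double)

class PdrTracker(startX: Double, startY: Double) {
    var x = startX; private set
    var y = startY; private set

    // Advance the estimate by one detected step along the current heading (radians).
    fun onStep(headingRad: Double, stepLengthM: Double = 0.7) {
        x += stepLengthM * cos(headingRad)
        y += stepLengthM * sin(headingRad)
    }

    // When a beacon is heard at close range, snap to its surveyed position
    // to cancel the drift accumulated by the inertial sensors.
    fun onBeaconProximity(beacon: Beacon, rangeM: Double, snapRadiusM: Double = 2.0) {
        if (rangeM <= snapRadiusM) {
            x = beacon.x
            y = beacon.y
        }
    }
}

fun main() {
    val tracker = PdrTracker(0.0, 0.0)
    repeat(10) { tracker.onStep(headingRad = 0.0) }   // walk ~7 m east through a beacon-free stretch
    // Drift is corrected at the next landmark with known coordinates.
    tracker.onBeaconProximity(Beacon("hall-3", 7.5, 0.2), rangeM = 1.2)
    println("position = (%.1f, %.1f)".format(tracker.x, tracker.y))
}
```

In this sketch the inertial sensors carry the estimate wherever external sensing is absent, and each beacon encounter bounds the accumulated drift, which matches the role the abstract assigns to the two sources of information.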
