This study proposes an eXtended Reality (XR) glasses-based walking assistance system to support independent, safe outdoor walking for visually impaired people. The system uses the YOLOv8n deep learning model to recognize walkable areas, public transport facilities, and obstacles in real time and to provide appropriate guidance to the user. Its core components are Xreal Light Smart Glasses and an Android-based smartphone, operated through a mobile application developed with the Unity game engine. The system divides the user's field of vision into nine zones, assesses the level of danger in each zone, and guides the user along a safe walking path. The YOLOv8n model was trained to recognize sidewalks, pedestrian crossings, bus stops, subway exits, and various obstacles; running on a smartphone connected to the XR glasses, it achieved an average processing time of 583 ms and an average memory usage of 80 MB, making it suitable for real-time use. Experiments on a 3.3 km route around Bokjeong Station in South Korea confirmed that the system works effectively in a variety of walking environments, but also revealed the need for better performance in low-light conditions and for further testing with visually impaired users. By proposing a walking assistance system that combines XR technology and artificial intelligence, this study is expected to contribute to improving the independent mobility of visually impaired people. Future research will further validate the system's effectiveness by integrating real-time public transport information and conducting extensive experiments with users with varying degrees of visual impairment.
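The nine-zone danger assessment described above could be sketched as follows. This is a minimal illustration, not the paper's implementation: the zone numbering, danger weights, and guidance rule are assumptions introduced here, and a real pipeline would feed in YOLOv8n detections rather than plain tuples.

```python
# Hypothetical sketch of a 3x3 zone danger assessment for a camera frame.
# Detections are (center_x, center_y, danger_weight) tuples; the weights
# and the "pick the safest lower zone" rule are illustrative assumptions.

def zone_of(cx, cy, width, height):
    """Map a detection center to one of nine zones (0..8),
    numbered left-to-right, top-to-bottom in a 3x3 grid."""
    col = min(int(3 * cx / width), 2)
    row = min(int(3 * cy / height), 2)
    return 3 * row + col

def assess_zones(detections, width, height):
    """Accumulate a danger score for each of the nine zones."""
    scores = [0.0] * 9
    for cx, cy, danger in detections:
        scores[zone_of(cx, cy, width, height)] += danger
    return scores

def safest_heading(scores):
    """Suggest a walking direction from the lower (nearest) row
    of zones: 6 = left, 7 = straight ahead, 8 = right."""
    lower = {6: "left", 7: "straight", 8: "right"}
    best = min(lower, key=lambda z: scores[z])
    return lower[best]
```

In this sketch the lower row of zones stands in for the ground directly ahead of the user, so guidance favors whichever of those three zones accumulated the least danger.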