Visually impaired individuals frequently encounter difficulties in detecting and avoiding obstacles in the wild. To address this issue, we propose an obstacle detection method for visual navigation assistance, named YOLO-OD. To improve the ability to detect and differentiate between obstacles of different sizes in outdoor environments, we introduce the Feature Weighting Block (FWB), which improves feature importance discrimination. To address the challenges of detecting obstacles in cluttered outdoor environments and handling occlusions, we introduce the Adaptive Bottleneck Block (ABB), which captures varying features across different scenes. To solve the problem of detecting relatively small obstacles in outdoor environments, we propose the Enhanced Feature Attention Head (EFAH). The proposed YOLO-OD achieves an average precision of 30.02% on a public dataset, making it a promising approach for navigation aids for blind and visually impaired people. Our study effectively addresses the navigation challenges faced by visually impaired individuals by improving model performance, thereby enhancing its practical value. The code for YOLO-OD has been made publicly available to ensure reproducibility and facilitate further research.
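The abstract does not detail how the Feature Weighting Block discriminates feature importance. As a purely illustrative sketch, not the authors' design: one common way to weight features by importance is channel-wise reweighting in the squeeze-and-excitation style, where each channel of a feature map is rescaled by a gate derived from its own global statistics. All names and the pooling/gating choices below are assumptions for illustration.

```python
import numpy as np

def channel_reweight(features):
    """Hypothetical channel-wise feature weighting (illustrative only).

    features: array of shape (C, H, W). Each channel is rescaled by a
    weight in (0, 1) derived from its global average, so channels with
    stronger average response contribute more to later detection stages.
    """
    # Squeeze: global average pooling per channel -> shape (C,)
    channel_stats = features.mean(axis=(1, 2))
    # Excite: sigmoid gate maps the statistics to weights in (0, 1)
    weights = 1.0 / (1.0 + np.exp(-channel_stats))
    # Reweight: broadcast each channel's weight over its spatial grid
    return features * weights[:, None, None]

x = np.random.default_rng(0).standard_normal((4, 8, 8))
y = channel_reweight(x)
```

Because the gate lies strictly between 0 and 1, the output preserves the input shape while attenuating every channel in proportion to its estimated importance; a learned variant would replace the fixed sigmoid-of-mean with trainable projection layers.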