Abstract

Advances in computer technology and computer vision have had a significant positive impact on the daily lives of blind people, particularly in efforts to improve their ability to navigate. The main aim of this research is to introduce a superior object detection method that supports sustained, effective navigation for the blind. The research focuses on YOLOv8, the latest version of YOLO, as the object detection method, combined with distance measurement technology from OpenCV. The central challenge addressed is improving the accuracy and performance of object detection, which is key to ensuring safe and effective navigation: blind people often face barriers to mobility, especially when moving through environments that may be full of obstructions. A better object detection method is therefore essential for identifying nearby objects, including obstacles and potential hazards, thereby preventing accidents and difficulties in daily travel. Using YOLOv8 as the object detection method offers a high level of accuracy, albeit with a slight increase in detection time and GPU power consumption compared to previous versions. The results show that the system yields a low error rate, with an average error of 3.15%, indicating highly accurate measurements. By jointly evaluating YOLOv8 detection performance and OpenCV distance measurement metrics, this research seeks to improve not only accuracy but also detection time and power efficiency. The work contributes a technological solution that can improve mobility and safety for blind people, bringing a tangible positive impact to their daily lives.
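The abstract does not give implementation details, but the described pipeline can be sketched roughly as follows. This is a minimal illustration, assuming the Ultralytics YOLOv8 Python API and a simple pinhole-camera distance estimate (distance = real_width × focal_length / pixel_width); the focal length and real-world object widths below are hypothetical placeholders, not calibration values from the paper.

```python
# Minimal sketch: YOLOv8 detection + pinhole-camera distance estimation.
# FOCAL_LENGTH_PX and KNOWN_WIDTHS_M are illustrative values, not the
# paper's calibration data.
import cv2
from ultralytics import YOLO

FOCAL_LENGTH_PX = 700.0                            # hypothetical focal length in pixels
KNOWN_WIDTHS_M = {"person": 0.5, "chair": 0.45}    # hypothetical real-world widths in meters

model = YOLO("yolov8n.pt")  # pretrained YOLOv8 nano weights

def estimate_distance(pixel_width: float, real_width_m: float) -> float:
    """Pinhole-camera distance estimate in meters."""
    return real_width_m * FOCAL_LENGTH_PX / pixel_width

cap = cv2.VideoCapture(0)   # default camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for result in model(frame, verbose=False):
        for box in result.boxes:
            x1, y1, x2, y2 = box.xyxy[0].tolist()
            label = model.names[int(box.cls)]
            real_width = KNOWN_WIDTHS_M.get(label)
            if real_width is not None:
                dist = estimate_distance(x2 - x1, real_width)
                print(f"{label}: ~{dist:.2f} m ahead")
cap.release()
```

In practice the focal length would come from camera calibration, and the distance estimate is only as reliable as the assumed real-world widths; the paper may use a different OpenCV-based measurement scheme.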
