The sense of sight is fundamental to how humans perceive and navigate their surroundings. For visually impaired individuals, the absence of this sense makes detecting obstacles and judging distances difficult, rendering daily navigation challenging and potentially dangerous. To address this problem, we propose an AI-based object detection, distance measurement, and speaking system integrated with a blind stick to enhance the mobility and independence of the visually impaired. The system applies artificial intelligence and image processing techniques to provide real-time object detection and distance measurement: an AI-powered object recognition module identifies obstacles, ultrasonic sensors measure their proximity, and directional audio feedback alerts the user. Designed to be compact and user-friendly, the blind stick offers seamless assistance in both indoor and outdoor environments. Object detection relies on the YOLO (You Only Look Once) algorithm, while ultrasonic sensors ensure accurate proximity readings. When an obstacle is detected, the system immediately announces its direction and distance. The Google Maps API adds location awareness and route planning, empowering users to navigate with confidence. This AI-based blind stick system aims to improve the mobility and safety of visually impaired individuals, offering a more autonomous experience in their daily lives.
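To make the detect-measure-speak loop described above concrete, the sketch below shows one possible implementation under stated assumptions: an HC-SR04 ultrasonic sensor wired to Raspberry Pi GPIO pins 23/24, a pretrained YOLOv8 model loaded through the `ultralytics` package, `pyttsx3` for offline text-to-speech, and a 150 cm alert threshold. All of these components and values are illustrative choices, not the paper's specified hardware or parameters.

```python
# Minimal sketch of the blind-stick loop: measure distance with an ultrasonic
# sensor, run YOLO on a camera frame when an obstacle is near, and speak an
# alert naming the object and its distance. Pins, model file, and threshold
# are assumptions for illustration only.
import time

import cv2
import pyttsx3
import RPi.GPIO as GPIO          # assumed Raspberry Pi host
from ultralytics import YOLO     # assumed YOLO implementation

TRIG_PIN, ECHO_PIN = 23, 24      # assumed HC-SR04 wiring
ALERT_DISTANCE_CM = 150          # assumed alert threshold

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)

model = YOLO("yolov8n.pt")       # assumed pretrained model file
speaker = pyttsx3.init()
camera = cv2.VideoCapture(0)


def measure_distance_cm():
    """Trigger the HC-SR04 and convert the echo time to centimetres."""
    GPIO.output(TRIG_PIN, True)
    time.sleep(0.00001)          # 10 microsecond trigger pulse
    GPIO.output(TRIG_PIN, False)

    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO_PIN) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO_PIN) == 1:
        pulse_end = time.time()

    # Sound travels ~34300 cm/s; the echo covers the distance twice.
    return (pulse_end - pulse_start) * 34300 / 2


def announce(text):
    """Speak an alert such as 'chair ahead, about 120 centimetres'."""
    speaker.say(text)
    speaker.runAndWait()


try:
    while True:
        distance = measure_distance_cm()
        if distance < ALERT_DISTANCE_CM:
            ok, frame = camera.read()
            if not ok:
                continue
            result = model(frame, verbose=False)[0]
            if len(result.boxes) > 0:
                # Name the most confident detection in the audio alert.
                best = max(result.boxes, key=lambda b: float(b.conf))
                label = result.names[int(best.cls)]
                announce(f"{label} ahead, about {int(distance)} centimetres")
            else:
                announce(f"obstacle ahead, about {int(distance)} centimetres")
        time.sleep(0.5)
finally:
    camera.release()
    GPIO.cleanup()
```

In this sketch the ultrasonic reading gates the heavier YOLO inference, so the camera and model run only when something is within the alert range; a production system would add direction estimation from the bounding-box position and integrate the Google Maps API for route guidance.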