The purpose of this research is to address the challenges faced by visually impaired individuals, particularly in operating household appliances independently. With approximately 285 million visually impaired individuals worldwide, technological solutions are crucial to enhancing their accessibility and independence. This paper introduces a Smart Assistance System designed to empower visually impaired individuals to interact with household appliances in real time without assistance. In this study, three Convolutional Neural Network (CNN) algorithms are compared to develop the system. The evaluation metrics include accuracy, precision, recall, F1 score, and Hamming loss on validation images. The performance comparison reveals that the custom CNN architecture, MobileNetV2, and YOLO models achieve F1 scores of 0.43, 0.63, and 0.24, respectively. To enhance object detection and classification, the paper proposes bounding-box categorization of appliance buttons using YOLOv8, which demonstrates superior performance with 95% classification accuracy on test images of home appliance buttons. Visually impaired individuals face similar difficulties in public spaces and when accessing public infrastructure. Expanding upon the proposed system's capabilities, the paper introduces the concept of panic button detection and activation in a bus environment tailored for blind individuals. This system relies on factors such as the number of people onboard, heart rate monitoring, and the detection of distress signals or SOS sounds emitted by the user. By integrating advanced sensing technologies and intelligent algorithms, this panic button detection system aims to provide prompt assistance and ensure the safety of visually impaired passengers in public transportation settings.
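For reference, the evaluation metrics named above can be computed as sketched below. This is a minimal illustration with hypothetical binary labels, not data from the paper; the function and values are assumptions for demonstration only.

```python
# Illustrative computation of the abstract's evaluation metrics
# (accuracy, precision, recall, F1, Hamming loss) on toy binary labels.

def classification_metrics(y_true, y_pred):
    """Return (accuracy, precision, recall, f1, hamming_loss) for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    n = len(y_true)
    mismatches = sum(1 for t, p in zip(y_true, y_pred) if t != p)

    accuracy = (n - mismatches) / n
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    hamming = mismatches / n  # fraction of incorrectly predicted labels
    return accuracy, precision, recall, f1, hamming

# Hypothetical predictions for five validation images
acc, prec, rec, f1, ham = classification_metrics([0, 1, 1, 0, 1], [0, 1, 0, 0, 1])
print(f"accuracy={acc:.2f} precision={prec:.2f} recall={rec:.2f} "
      f"f1={f1:.2f} hamming={ham:.2f}")
# → accuracy=0.80 precision=1.00 recall=0.67 f1=0.80 hamming=0.20
```

For multi-label tasks (an image containing several buttons), Hamming loss generalizes naturally as the fraction of wrongly predicted labels across all classes, which is why it appears alongside the per-class metrics.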