Abstract

Deaf people in Indonesia use SIBI (Sistem Isyarat Bahasa Indonesia, the Indonesian Sign Language System) to communicate through hand gestures, facial expressions, and body language that represent spoken Indonesian words. SIBI is the sign system certified for use in Special Schools (Sekolah Luar Biasa, SLB) and supports communication for deaf pupils. This project implements a real-time SIBI detection system at Sekolah Luar Biasa Negeri 1 Tabanan using image processing and the Ultralytics YOLOv8 deep learning framework. A sign language gesture detection model is trained on Google Colab's GPU using images of SIBI signs. A camera captures the gestures, which are processed by the YOLOv8 model trained on the SIBI gesture data; the system recognizes gestures in real time and produces text output for people who do not use sign language. The dataset contains 107 vocabulary classes and 7 affix (prefix) classes for complete gesture recognition. Detection performance is affected by shirt color, room brightness, and webcam quality: accuracy reaches 87.74% under optimal conditions and drops to 58.02% under suboptimal conditions. Despite these limitations, the approach helps deaf students communicate more effectively with non-sign-language speakers. The system improves inclusivity and communication in schools, making learning easier for hearing-impaired pupils, and provides a reliable and fast sign language recognition tool to support the educators and caregivers of deaf students in daily interaction and teaching.
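As a concrete illustration of the real-time detection loop described above, the following is a minimal sketch, not the authors' code, of webcam inference with Ultralytics YOLOv8 and OpenCV; the weights file name sibi_yolov8.pt and the 0.5 confidence threshold are assumptions for illustration only.

    # Minimal sketch of real-time SIBI gesture detection with Ultralytics YOLOv8.
    # "sibi_yolov8.pt" is a hypothetical weights file trained on the SIBI dataset.
    import cv2
    from ultralytics import YOLO

    model = YOLO("sibi_yolov8.pt")   # assumed: model fine-tuned on SIBI gesture images
    cap = cv2.VideoCapture(0)        # default webcam

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Run detection on the current frame; conf filters low-confidence boxes.
        results = model(frame, conf=0.5, verbose=False)
        for box in results[0].boxes:
            label = model.names[int(box.cls[0])]     # predicted SIBI gesture class
            score = float(box.conf[0])               # detection confidence
            x1, y1, x2, y2 = map(int, box.xyxy[0])   # bounding-box corners
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
            cv2.putText(frame, f"{label} {score:.2f}", (x1, y1 - 8),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        cv2.imshow("SIBI detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):        # press q to quit
            break

    cap.release()
    cv2.destroyAllWindows()

The recognized class labels could then be concatenated into text output for non-sign-language users, which is the role the paper attributes to the system.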
