Abstract
Sign language is a means of communication for people with hearing and speech impairments, generally using hand gestures to articulate sentences. Different countries use different sign languages, which increases the complexity of sign language recognition. Communication between people with and without hearing impairments is difficult because most non-disabled people do not learn sign language. This research aims to develop a device that translates sign language describing body conditions so that people without hearing impairments can understand it. The device uses sensors to capture hand-movement patterns and recognize the corresponding signs. The sign language recognized in this study focuses on body conditions and illnesses: asthma, cough, dizziness, depression, and tonsillitis. The test results show that the Artificial Neural Network (ANN) model performs well, with 98% accuracy, 98% precision, 98% recall, and a 98% F1-score. Testing was conducted after the model was embedded in the device and used for sign language recognition. The confusion-matrix metrics show high values. It can be concluded that the model can be used to translate the sign language expressions for tonsillitis, cough, depression, dizziness, and asthma.
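The evaluation metrics reported above (accuracy, precision, recall, F1-score) are all derived from a confusion matrix. As a minimal sketch of how such metrics are computed for a five-class classifier like the one described, the following uses an illustrative confusion matrix (the counts are hypothetical, not the paper's actual test data) with macro-averaging over the classes:

```python
# Hedged sketch: deriving accuracy, macro precision, recall, and F1
# from a multi-class confusion matrix. The matrix values below are
# illustrative only, not the study's actual results.

CLASSES = ["asthma", "cough", "dizziness", "depression", "tonsillitis"]

def metrics_from_confusion(cm):
    """cm[i][j] = number of samples with true class i predicted as class j."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(n))  # diagonal = correct predictions
    accuracy = correct / total

    precisions, recalls, f1s = [], [], []
    for k in range(n):
        tp = cm[k][k]
        predicted_k = sum(cm[i][k] for i in range(n))  # column sum
        actual_k = sum(cm[k])                          # row sum
        p = tp / predicted_k if predicted_k else 0.0
        r = tp / actual_k if actual_k else 0.0
        f1 = 2 * p * r / (p + r) if (p + r) else 0.0
        precisions.append(p)
        recalls.append(r)
        f1s.append(f1)

    # Macro-averaging: unweighted mean of the per-class scores.
    return {
        "accuracy": accuracy,
        "precision": sum(precisions) / n,
        "recall": sum(recalls) / n,
        "f1": sum(f1s) / n,
    }

# Illustrative confusion matrix for the five body-condition classes.
cm = [
    [20, 0, 0, 0, 0],
    [0, 19, 1, 0, 0],
    [0, 0, 20, 0, 0],
    [0, 0, 0, 20, 0],
    [0, 1, 0, 0, 19],
]
print(metrics_from_confusion(cm))
```

With this example matrix, 98 of 100 samples lie on the diagonal, giving 98% accuracy and macro recall, consistent with the magnitude of the reported figures.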