Abstract

American Sign Language (ASL) is the only major language used in the educational system for hearing-impaired people in Nigeria, yet automatic sign recognition systems are not currently used in the teaching of hearing-impaired students. This study developed a real-time, large-vocabulary ASL recognition system implemented on Android devices. Samples of static and dynamic hand gestures were collected from the primary school for the handicapped, Osogbo. The specific objectives were to collect hand gestures, examine the specific features for the recognition process, design a model based on the examined features, implement the model, and evaluate the performance of the system. The real-time vocabulary of signs was recognized using a Convolutional Neural Network (CNN) implemented in the Python programming language. The developed system was evaluated using precision, recall, and accuracy as metrics. Model prediction on the test images achieved an overall accuracy of 92.98%. The results showed that the system will enhance learning skills and provide an adequate learning platform for both students and teachers in hearing-impaired schools.
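As a minimal sketch of the evaluation metrics named above (precision, recall, and accuracy), the following hypothetical Python snippet computes them one-vs-rest from predicted and true gesture labels. The labels and function name are illustrative assumptions, not the study's actual data or code.

```python
# Hypothetical illustration: precision, recall, and accuracy for one
# gesture class treated as the positive label (one-vs-rest). The label
# values below are made up for demonstration only.

def evaluate(y_true, y_pred, positive):
    """Return (precision, recall, accuracy) for the given positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    accuracy = correct / len(y_true)
    return precision, recall, accuracy

# Example with made-up labels for three gesture classes
y_true = ["A", "B", "A", "C", "A", "B"]
y_pred = ["A", "B", "C", "C", "A", "A"]
p, r, a = evaluate(y_true, y_pred, positive="A")
```

In practice a multi-class system like the one described would average these per-class scores (macro- or micro-averaging) across the whole gesture vocabulary.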
