Abstract

This paper presents a sign language recognition system for the low-resource Sinhala Sign Language (SSL) using a Leap Motion (LM) controller and a Deep Neural Network (DNN). The study extracts static and dynamic features of SSL hand movements with the LM controller, which acquires the palm position, the radius of the hand sphere, and the positions of the five fingers; the proposed system is evaluated on 24 selected letters and 6 words. The experimental results show that the proposed DNN model, with an average testing accuracy of 89.2%, outperforms a Naïve Bayes model (73.3% testing accuracy) and a Support Vector Machine (SVM) based model (81.2% testing accuracy). The proposed system, which combines a non-contact 3D LM controller with a machine learning model, therefore has great potential as an affordable solution for people with hearing impairment when communicating with hearing people in their day-to-day life across all service sectors.
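As a rough illustration of the classification stage described above, the sketch below runs a minimal feed-forward network over a Leap-Motion-style feature vector (palm position, hand-sphere radius, and five fingertip positions). The layer sizes, feature ordering, and all variable names here are assumptions for illustration, not the paper's actual architecture or trained weights.

```python
import numpy as np

# Hypothetical feature layout (assumed, not taken from the paper):
# palm position (x, y, z) + hand-sphere radius + 5 fingertip positions (x, y, z each)
N_FEATURES = 3 + 1 + 5 * 3   # 19 features per frame
N_CLASSES = 30               # 24 letters + 6 words, as in the study

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    """Small random weights and zero biases for one dense layer (untrained)."""
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

# A minimal two-hidden-layer DNN; the hidden sizes 64 and 32 are assumptions.
W1, b1 = init_layer(N_FEATURES, 64)
W2, b2 = init_layer(64, 32)
W3, b3 = init_layer(32, N_CLASSES)

def softmax(z):
    """Numerically stable softmax over the class axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predict_proba(features):
    """Forward pass: ReLU hidden layers, softmax over the 30 sign classes."""
    h1 = np.maximum(0.0, features @ W1 + b1)
    h2 = np.maximum(0.0, h1 @ W2 + b2)
    return softmax(h2 @ W3 + b3)

# One synthetic frame standing in for a real Leap Motion sample
frame = rng.normal(0.0, 1.0, (1, N_FEATURES))
probs = predict_proba(frame)
predicted_class = int(probs.argmax(axis=-1)[0])
```

In a trained system the weights would of course come from supervised training on labelled SSL samples rather than random initialisation; the sketch only shows how a 19-dimensional LM feature frame maps to a probability distribution over the 30 sign classes.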
