Abstract

In this study, a machine learning-based system that recognises the Turkish Sign Language alphabet in real time, independently of the signer, was developed. A Leap Motion sensor was used to obtain raw data from individuals. Handcrafted features were then extracted by applying the Euclidean distance to the raw data; they include finger-to-finger, finger-to-palm, finger-to-wrist-bone, palm-to-palm and wrist-to-wrist distances. LR, k-NN, RF, DNN and ANN single classifiers were trained on the handcrafted features, and a cascade voting approach with two voting steps was applied. The first vote produced each classifier's final prediction; the second vote, over the predictions of all classifiers at the final decision stage, was applied to improve the performance of the proposed system. The system was tested in real time by an individual whose hand data were not included in the training dataset. According to the results, the proposed system achieves 100% accuracy in classifying one-hand letters. It also recognises two-hand letters with 100% accuracy, except for the letters "J" and "H", which were recognised with 80% and 90% accuracy, respectively. Overall, the cascade voting approach delivered a high average classification accuracy of 98.97%. The proposed system enables Turkish sign language recognition with high accuracy in real time.
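The handcrafted features described above are pairwise Euclidean distances between hand landmarks. A minimal sketch of that idea is shown below; the landmark names and coordinates are illustrative assumptions, not the paper's actual Leap Motion point set.

```python
import math

def euclidean(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def handcrafted_features(landmarks):
    """Pairwise distances between named hand landmarks.

    `landmarks` maps a point name to an (x, y, z) tuple, as a Leap Motion
    frame might provide; the point set used here is purely illustrative.
    """
    names = sorted(landmarks)
    feats = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            feats[f"{a}-{b}"] = euclidean(landmarks[a], landmarks[b])
    return feats

# Toy example with made-up coordinates (not real sensor data)
hand = {
    "thumb_tip": (0.0, 4.0, 0.0),
    "index_tip": (3.0, 4.0, 0.0),
    "palm":      (0.0, 0.0, 0.0),
    "wrist":     (0.0, -2.0, 0.0),
}
features = handcrafted_features(hand)
print(features["index_tip-thumb_tip"])  # 3.0
```

A real pipeline would compute these distances per sensor frame and feed the resulting feature vector to the classifiers.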

Highlights

  • Individuals with speaking and hearing impairments use body and hand gestures to communicate with other individuals

  • The Logistic Regression (LR), k-Nearest Neighbours (k-NN), Random Forest (RF), Deep Neural Network (DNN) and Artificial Neural Network (ANN) classifiers employed for classifying the Turkish Sign Language alphabet are explained

  • The study explains a new handcrafted feature extraction method, based on a mathematical model, that allows recognition independent of the person performing the sign language


Summary

INTRODUCTION

Individuals with speaking and hearing impairments use body and hand gestures to communicate with other individuals. These signals, made with gestures and facial expressions, are called Sign Language (SL) [1]. The Leap Motion (LM) sensor uses two infrared (IR) cameras and three infrared LEDs to observe an approximately hemispherical area with a radius of about one metre; it was designed especially for recognising hand and finger movements. The contributions of this work are: (1) a Turkish sign language recognition system that works in real time was developed; (2) a hybrid machine learning mechanism (including two voting methods) is proposed for Turkish sign language alphabet recognition; (3) experimental validation was performed in real time by a person whose sign language data were not included in the training dataset. The third section of the article presents the data collection, the handcrafted feature extraction method and the cascade voting approach.
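The two-step cascade voting described here can be sketched with plain majority votes: first over each classifier's per-frame predictions, then over the classifiers' resulting labels. This is a minimal illustration under that assumption; the frame counts, labels and classifier order are made up.

```python
from collections import Counter

def majority(votes):
    """Return the most common label in a list of votes."""
    return Counter(votes).most_common(1)[0][0]

def cascade_vote(per_classifier_frame_preds):
    """Two-step voting: frame-level vote per classifier, then a vote
    over all classifiers' final predictions.

    `per_classifier_frame_preds` is a list (one entry per classifier)
    of lists of per-frame predicted letters.
    """
    # Step 1: each classifier's final prediction over its frames
    step1 = [majority(frames) for frames in per_classifier_frame_preds]
    # Step 2: vote over all classifiers' final predictions
    return majority(step1)

preds = [
    ["A", "A", "B"],  # e.g. LR
    ["A", "C", "A"],  # e.g. k-NN
    ["B", "B", "B"],  # e.g. RF
    ["A", "A", "A"],  # e.g. DNN
    ["A", "B", "A"],  # e.g. ANN
]
print(cascade_vote(preds))  # "A"
```

The second vote lets a stable majority of classifiers override a single classifier that consistently misreads a sign.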

RELATED STUDIES
THE PROPOSED SYSTEM
Dataset
Handcraft Feature Extraction
Classification
Findings
CONCLUSION
