Abstract

Human Computer Interaction (HCI) focuses on the interaction between humans and machines. Hand gesture recognition techniques, major candidates for HCI, have an extensive list of applications spanning various fields, one of which is sign language recognition. In this field, both high accuracy and robustness are needed, and each presents a major challenge. In addition, feature extraction from hand gesture images is difficult because of the many parameters associated with them. This paper proposes an approach based on a bag-of-words (BoW) model for automatic recognition of American Sign Language (ASL) numbers. In this method, the first step is to obtain a set of representative vocabularies by applying a K-means clustering algorithm to a few randomly chosen images. Next, the vocabularies are used as bin centers for BoW histogram construction. The proposed histograms are shown to provide distinguishable features for classification of ASL numbers. For classification, the K-nearest neighbors (kNN) classifier is employed, using the BoW histogram bin frequencies as features. For validation, extensive experiments are conducted on two large ASL number-recognition datasets; the proposed method shows superior performance in classifying the numbers, achieving an F1 score of 99.92% on the Kaggle ASL numbers dataset.
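The pipeline described in the abstract (K-means vocabulary construction, BoW histograms, kNN on the histogram bin frequencies) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the local descriptor (raw grayscale patches), patch size, vocabulary size, and number of neighbors are illustrative choices, since the summary does not specify them.

```python
# Minimal sketch of a BoW + kNN pipeline for ASL number images.
# Assumptions: raw grayscale patches as local descriptors, 50 visual words,
# and k = 5 neighbors; the paper summary does not fix these values.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def extract_patches(image, patch=8, stride=8):
    """Slide a window over the image and return flattened local patches."""
    h, w = image.shape
    return np.array([
        image[y:y + patch, x:x + patch].ravel()
        for y in range(0, h - patch + 1, stride)
        for x in range(0, w - patch + 1, stride)
    ])

def build_vocabulary(images, n_words=50, seed=0):
    """Cluster patches from a few randomly chosen images into visual words."""
    descriptors = np.vstack([extract_patches(img) for img in images])
    return KMeans(n_clusters=n_words, random_state=seed, n_init=10).fit(descriptors)

def bow_histogram(image, vocab):
    """Assign each patch to its nearest visual word and count bin frequencies."""
    words = vocab.predict(extract_patches(image))
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()  # normalise so histograms of different images are comparable

# Usage sketch: train_imgs/test_imgs are grayscale arrays, labels are the digits 0-9.
# vocab = build_vocabulary(train_imgs[:20])
# X_train = np.array([bow_histogram(img, vocab) for img in train_imgs])
# X_test  = np.array([bow_histogram(img, vocab) for img in test_imgs])
# knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, train_labels)
# predictions = knn.predict(X_test)
```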

Highlights

  • The human computer interface (HCI) refers to the user interfaces in a production or process-control system; it deals with the design, implementation, and assessment of new interfaces to improve the interaction between humans and machines [1], [2]

  • Gongfa [18] develops a method with moderate accuracy based on a skeletonization algorithm and a Convolutional Neural Network (CNN); it reduces the effect of shooting angle and surroundings, both of which have a massive impact on recognition

  • An efficient American Sign Language (ASL) number recognition scheme is developed in this paper, using bag-of-words (BoW) histograms to extract features


Summary

Introduction

The human computer interface (HCI) refers to the user interfaces in a production or process-control system; it deals with the design, implementation, and assessment of new interfaces to improve the interaction between humans and machines [1], [2]. Automatic hand gesture recognition (HGR) has become a major concern for researchers [7], [8]. It has been applied in some interesting fields, such as gaming, sign language recognition (SLR), and virtual reality [9]. A comprehensive and detailed analysis of existing research techniques for the recognition of sign language is given in [1], accompanied by a discussion of the usual challenges for gesture recognition systems; the work aims to guide entry into, and to facilitate increasing efforts in, the SLR research field. A user evaluation study [7], conducted on 25 visually challenged people with the aim of enabling the impaired community to use hand gestures to interact with machines, led to the proposal of an innovative dactylology.

