Abstract

Purpose
This paper aims to propose a novel methodology for classifying hand gestures using the support vector machine (SVM) classification method. Initially, the red-green-blue (RGB) hand gesture image is converted into a YCbCr image in the preprocessing stage, and then the palm-and-finger region is segmented by a thresholding process. Next, the distance transform is applied to the segmented palm-and-finger image. Finally, the center point (centroid) of the palm region is detected, and the fingertips are detected using the SVM classification algorithm based on the detected palm centroid.

Design/methodology/approach
A gesture is a physical movement of the body that conveys information. Though any bodily movement can be considered a gesture, most gestures originate from the movement of the hand, the face or a combination of both. Combined gestures are quite complex and difficult for a machine to classify. This paper proposes a novel methodology for classifying such gestures with the SVM classification method. Initially, the color hand gesture image is converted into a YCbCr image in the preprocessing stage, and then the palm-and-finger region is segmented by a thresholding process. Next, the distance transform is applied to the segmented palm-and-finger image. Finally, the center point of the palm region is detected, and the fingertips are detected using the SVM classification algorithm. The proposed hand gesture image classification system is applied and tested on the "Jochen Triesch," "Sebastien Marcel" and "11Khands" data sets to evaluate its efficiency. The performance of the proposed system is analyzed with respect to sensitivity, specificity, accuracy and recognition rate, and the simulation results on these data sets are compared with conventional methods.

Findings
This paper proposes a novel methodology for classifying hand gestures using the SVM classification method, with the distance transform used to detect the center point of the segmented palm region. The proposed methodology achieves 96.5% sensitivity, 97.1% specificity, 96.9% accuracy and a 99.3% recognition rate on the "Jochen Triesch" data set; 94.6% sensitivity, 95.4% specificity, 95.3% accuracy and a 97.8% recognition rate on the "Sebastien Marcel" data set; and 97% sensitivity, 98% specificity, 98.1% accuracy and a 98.8% recognition rate on the "11Khands" data set. The recognition time is 0.52 s on "Jochen Triesch" images, 0.71 s on "Sebastien Marcel" images and 0.22 s on "11Khands" images. The proposed methodology therefore requires the least recognition time on the "11Khands" data set, making it well suited for real-time hand gesture applications in multi-background environments.

Originality/value
The modern world requires more automated systems to improve daily routine activities efficiently. Present-day technology has introduced touch-screen methods for operating many devices or machines, with or without wired connections. This also has an impact on automated vehicles, which can be operated without any direct interfacing with the driver. This is possible through a hand gesture recognition system, which captures real-time hand gestures, physical movements of the human hand, as digital images and recognizes them against a pre-stored set of hand gestures.
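The preprocessing and centroid-detection steps described above (RGB-to-YCbCr conversion, threshold segmentation, distance transform, palm-center detection) can be illustrated with a minimal Python/OpenCV sketch. The skin-tone threshold values, the input file name and the channel layout are assumptions for illustration, not values taken from the paper, and the SVM fingertip classifier is only indicated in a comment.

```python
# Minimal sketch of the described pipeline, assuming OpenCV's YCrCb channel
# ordering and generic skin-tone thresholds (not the paper's actual values).
import cv2
import numpy as np

def detect_palm_centroid(bgr_image):
    # Convert the color hand image to the YCbCr (YCrCb in OpenCV) color space.
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)

    # Threshold the chrominance channels to segment the palm-and-finger region.
    # Channel order is [Y, Cr, Cb]; these are illustrative skin-tone ranges.
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)

    # Apply the distance transform to the binary segmentation; the pixel with
    # the largest distance to the background approximates the palm centroid.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, _, _, centroid = cv2.minMaxLoc(dist)
    return mask, centroid

if __name__ == "__main__":
    image = cv2.imread("hand_gesture.png")  # hypothetical input image
    mask, centroid = detect_palm_centroid(image)
    print("Estimated palm centroid:", centroid)
    # Fingertip candidates located relative to this centroid would then be fed
    # to an SVM classifier (e.g. sklearn.svm.SVC) trained on labeled gestures.
```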
