Abstract
The popularity of 3D image sensors on the market has spurred rapid development of hand gesture-based applications. By utilizing 3D-space hand gesture data, hand gesture commands can be classified to control specific target devices such as smart speakers and smart TVs. However, context-aware cognition, such as human emotion recognition based on 3D hand gesture action characteristics, has rarely been explored. In this work, focusing on the specific group with restlessness emotion problems, a hand gesture action-based emotion recognition system with gesture-making user identification is developed using the well-known Leap Motion sensor. By acquiring 3D-space action variation information extracted from the gesture action made by a person, the specific human emotion category can then be recognized. In this study, ten different degrees of restlessness emotion behavior are defined according to a designed restlessness emotion hand gesture categorization table that denotes three restlessness characteristics of a restlessness hand gesture action: speed, repetition, and attack. Three types of feature parameter sets derived from the Leap Motion sensor are presented: a 78-dimension set with 3D hand gesture action variations, a 7-dimension set with physical characteristics of the hand, and an 85-dimension set combining the 78-dimension and 7-dimension data. The presented 7-dimension feature parameter set is employed for hand gesture-making user identification, and the other two designed feature parameter sets are used to classify the defined hand gesture actions with different restlessness emotion degrees. The popular K-nearest neighbor (KNN) approach is adopted as the gesture data classifier for performance evaluation of all these feature parameter sets in this study.
Recognition experiments with 4 subjects in a laboratory office show that KNN with the 85-dimension feature data achieves an average recognition accuracy of 80.6%, slightly superior to the 80% obtained with the 78-dimension feature data. KNN with the 7-dimension feature parameter set achieves an average accuracy of 66.1% on gesture-making user identification.
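For readers unfamiliar with the classifier named above, the following is a minimal, self-contained sketch of KNN majority-vote classification. The toy 3-dimension vectors and the "calm"/"restless" labels are illustrative assumptions only; they stand in for the paper's 78- to 85-dimension Leap Motion feature vectors and its ten restlessness degrees, whose exact definitions are given in the paper itself.

```python
import math
from collections import Counter

def knn_classify(train_X, train_y, query, k=3):
    """Label a query vector by majority vote among its k nearest
    training vectors under Euclidean distance."""
    # Sort all training samples by distance to the query.
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    # Majority vote over the k closest labels.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 3-dimension stand-ins for the paper's gesture feature vectors.
train_X = [(0.10, 0.20, 0.10), (0.20, 0.10, 0.15),
           (0.90, 0.80, 0.95), (0.85, 0.90, 1.00)]
train_y = ["calm", "calm", "restless", "restless"]

print(knn_classify(train_X, train_y, (0.88, 0.85, 0.90)))
```

A query near the high-magnitude cluster is voted "restless"; in the actual system, the same voting step would run over the 85-dimension feature vectors extracted from the Leap Motion sensor.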