Abstract

An algorithm for the robust detection and recognition of gestures for interaction between a human and a domestic floor-cleaning robot is presented. The gestures are selected through a user study in which participants are asked to show natural gestures to the robot in specific interaction scenarios. The gestures selected are those repeated by the majority of participants and comprise both commanding gestures (e.g., start cleaning) and social interaction gestures (e.g., greeting). The recognition algorithm combines robust angular, positional, and directional features. The frontal and sagittal planes of the human body are identified, and invariant angular features are extracted from the skeletal data of a Kinect sensor. Robust positional and directional features are then obtained by reconstructing the skeleton from the invariant angular features and the link rotation matrices. Dynamic time warping of the feature sequences makes the algorithm robust to gesturing speed. Gestures are detected and recognized by computing multiclass probability estimates with the pairwise coupling method. The algorithm achieved 97.26% recognition accuracy on a ten-class robot-commanding gesture database collected from multiple subjects.
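The abstract names three computational stages: invariant angular features from skeleton joints, dynamic time warping (DTW) to normalize gesturing speed, and pairwise coupling of binary classifier outputs into multiclass probabilities. The sketch below illustrates each stage in Python under stated assumptions; it is not the paper's implementation. The helper names (`joint_angle`, `dtw_distance`, `couple_probabilities`) are hypothetical, and since the abstract does not say which coupling variant is used, the Hastie-Tibshirani iteration stands in here for illustration.

```python
# Minimal sketch of the three pipeline stages described in the abstract.
# All function names and the toy data are illustrative, not from the paper.
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b formed by segments b->a and b->c, in radians.
    Angles like this are invariant to the sensor's viewpoint and to
    the subject's body size, which is why angular features are robust."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def dtw_distance(x, y):
    """Classic dynamic time warping distance between two sequences of
    feature vectors; aligning sequences this way removes sensitivity
    to how quickly the subject performs the gesture."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def couple_probabilities(r, iters=200):
    """Hastie-Tibshirani iterative pairwise coupling (assumed variant):
    fold pairwise estimates r[i, j] ~ P(class i | class i or j) into a
    single multiclass probability vector p."""
    k = r.shape[0]
    p = np.full(k, 1.0 / k)
    for _ in range(iters):
        for i in range(k):
            num = sum(r[i, j] for j in range(k) if j != i)
            den = sum(p[i] / (p[i] + p[j]) for j in range(k) if j != i)
            p[i] *= num / den
        p /= p.sum()
    return p

if __name__ == "__main__":
    # Toy check: a right-angled elbow gives ~pi/2.
    s, e, w = np.array([0., 0, 0]), np.array([0.3, 0, 0]), np.array([0.3, 0.3, 0])
    print(f"elbow angle: {joint_angle(s, e, w):.3f} rad")

    # Toy pairwise estimates for 3 gesture classes, favoring class 0.
    r = np.array([[0.0, 0.8, 0.7],
                  [0.2, 0.0, 0.6],
                  [0.3, 0.4, 0.0]])
    print("class probabilities:", couple_probabilities(r))
```

In a full pipeline, the pairwise estimates `r[i, j]` would come from per-pair classifiers scoring DTW-aligned feature sequences; the toy values above only demonstrate the coupling step.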
