Abstract

User interfaces based on mid-air gesture recognition are expected to become popular in the near future due to the increasing spread of virtual and mixed reality applications and smart devices. The design of such interfaces would clearly benefit from simple and effective methods to compare short 3D trajectories, allowing fast and accurate recognition of command gestures from only a few examples. This approach, quite popular in 2D touch-based interfaces thanks to the so-called "dollar" family of recognizers, has not been deeply investigated for 3D mid-air gestures. In this paper, we explore several metrics that can be used to compare mid-air gestures and present experimental tests analyzing their effectiveness on practical tasks. By making careful choices in the processing and comparison of gesture traces, we obtained very good results in the retrieval and recognition of simple command gestures, from complete or even partial hand trajectories. The approach was also extended to recognize gestures characterized by both hand and finger motion, and it reached state-of-the-art performance on a recent benchmark.
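The core idea of dollar-style template matching carries over naturally from 2D strokes to 3D trajectories. The sketch below is not the method evaluated in the paper; it is a minimal illustration, assuming NumPy, of the generic pipeline of resampling a trace to a fixed number of points, normalizing position and scale, and scoring it against labeled templates with an average point-to-point Euclidean distance. All function names (resample, normalize, average_distance, recognize) are illustrative.

# Minimal sketch of "$1-style" template matching extended to 3D trajectories.
# Not the paper's method; it only illustrates resample + normalize + compare.
import numpy as np

def resample(points: np.ndarray, n: int = 64) -> np.ndarray:
    """Resample a 3D trajectory of shape (m, 3) to n equidistant points."""
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    cumulative = np.concatenate(([0.0], np.cumsum(seg)))
    targets = np.linspace(0.0, cumulative[-1], n)
    resampled = np.empty((n, 3))
    for d in range(3):
        resampled[:, d] = np.interp(targets, cumulative, points[:, d])
    return resampled

def normalize(points: np.ndarray) -> np.ndarray:
    """Translate the trace to its centroid and scale it to unit size."""
    centered = points - points.mean(axis=0)
    scale = np.max(np.ptp(centered, axis=0))
    return centered / scale if scale > 0 else centered

def average_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Mean Euclidean distance between corresponding resampled points."""
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

def recognize(trace: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the label of the stored template closest to the input trace."""
    query = normalize(resample(trace))
    scores = {label: average_distance(query, normalize(resample(t)))
              for label, t in templates.items()}
    return min(scores, key=scores.get)

With a handful of recorded examples per command stored as templates, recognition of a new trace then reduces to picking the closest template under the chosen distance.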
