Abstract
Noncontact gesture recognition is increasingly used in emerging applications such as smart cars and smartphones. Negative-latency gesture recognition (recognizing a gesture before it is finished) is desirable because it enables instantaneous feedback. However, existing methods struggle to achieve both high precision and negative latency: a short fragment provides too few features to identify all gestures directly. By examining a large number of existing gesture sets and people's everyday operating habits, we found that some frequently used gestures are similar. To the best of our knowledge, this is the first work to repartition gestures into two subsets according to their physical states of motion: gestures with different shapes or motion states form a parent-class subset, and each pair of parent-class gestures is further divided to form a child-class subset. To achieve a better tradeoff between high precision and negative latency, we propose an approach based on motion pattern and behavior intention (MPBI). Exploiting the characteristics of each subset, MPBI comprises two models: a pattern model first coarsely classifies the parent-class gestures with a convolutional network, and an intention model then distinguishes child-class gestures by their opposite motion directions. MPBI is evaluated on a 340-GHz terahertz radar; thanks to the radar's accurate ranging, the intention model can recognize child-class gestures directly without training. Evaluated on 12 gestures, MPBI achieves a recognition accuracy of 94.13% while needing only a 0.033-s gesture fragment as an input sample.
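The two-stage decision described above can be sketched in a few lines. This is a hedged illustration only: the paper's pattern model is a trained convolutional network (assumed given here), and the intention model is reduced to a sign test on the radial displacement measured by the radar's ranging, which is the training-free idea the abstract describes. All function and class names (`intention_model`, `mpbi_classify`, the gesture labels) are hypothetical.

```python
# Illustrative sketch of MPBI's two-stage classification (names are
# hypothetical, not from the paper). Stage 1, the "pattern model",
# would be a trained CNN that assigns a parent class; stage 2, the
# "intention model", separates the two child gestures of a parent
# class by the sign of the radial motion direction, with no training.

def intention_model(range_samples):
    """Classify motion direction from a short fragment of range readings (m)."""
    # Net displacement over the fragment: an approaching hand shrinks the range.
    displacement = range_samples[-1] - range_samples[0]
    return "away" if displacement > 0 else "toward"

def mpbi_classify(parent_class, range_samples):
    # parent_class is assumed to come from the CNN pattern model.
    direction = intention_model(range_samples)
    return f"{parent_class}:{direction}"

# Example: a short fragment of ranges for a hand moving toward the radar.
fragment = [0.52, 0.50, 0.47, 0.45]
print(mpbi_classify("push_pull", fragment))  # push_pull:toward
```

The point of the split is that the CNN never has to tell apart child gestures whose spectrogram features overlap within a 0.033-s fragment; the sign of the measured displacement resolves them directly.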
More From: IEEE Transactions on Geoscience and Remote Sensing