Abstract

An algorithm for flexible mapping (FM) from one gesture to multiple semantics within the same situational context is presented for the first time, with the aim of reducing the cognitive and operational load on an operator who uses gesture commands. First, the foundation of FM is built on the operator's semantic-oriented difference features of the behavior model (SDFBM), which serve as implicit inputs. Second, the gesture–semantics FM algorithm and its extended process, namely attribute classification for FM, are proposed and implemented. Third, five commonly used gestures are designed to demonstrate how a single gesture is mapped to multiple semantics. Finally, several comparative experimental results are provided to demonstrate the superiority of the proposed methods over the state of the art. The main innovation of this study is that the same gesture in the same context can be mapped to several different semantics through SDFBM feature recognition. This study provides an intelligent and natural interaction interface model for 3D platforms and key support for gesture-based implicit interaction design. The proposed algorithms have also been tested or used in several applications, such as a smart teaching interface, onboard vehicular systems, and intelligent TV.
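To make the core idea concrete, the sketch below illustrates (in Python) how a single gesture could resolve to different semantics depending on implicit behavior features, under the assumption that the SDFBM can be represented as a feature vector. The paper's actual FM algorithm and feature definitions are not reproduced here; all names (`GestureEvent`, `SEMANTIC_PROTOTYPES`, `select_semantic`) and the nearest-prototype selection rule are hypothetical stand-ins.

```python
# Illustrative sketch only: the SDFBM features and the FM algorithm are assumed,
# not taken from the paper. A gesture plus an implicit behavior-feature vector
# is mapped to one of several candidate semantics via a nearest-prototype rule.
from dataclasses import dataclass
from typing import Dict, List
import math


@dataclass
class GestureEvent:
    gesture: str        # recognized gesture label, e.g. "swipe_right"
    sdfbm: List[float]  # implicit behavior-difference features (assumed vector form)


# Hypothetical per-semantic prototype features for one gesture.
SEMANTIC_PROTOTYPES: Dict[str, Dict[str, List[float]]] = {
    "swipe_right": {
        "next_page":   [0.9, 0.1, 0.2],
        "rotate_view": [0.2, 0.8, 0.5],
        "dismiss":     [0.1, 0.3, 0.9],
    },
}


def select_semantic(event: GestureEvent) -> str:
    """Map one gesture to one of several semantics by comparing the operator's
    implicit SDFBM features against per-semantic prototypes."""
    candidates = SEMANTIC_PROTOTYPES[event.gesture]
    return min(candidates, key=lambda s: math.dist(event.sdfbm, candidates[s]))


if __name__ == "__main__":
    # The same "swipe_right" gesture resolves to different semantics depending on
    # the operator's behavior features, without changing the situational context.
    print(select_semantic(GestureEvent("swipe_right", [0.85, 0.15, 0.25])))  # next_page
    print(select_semantic(GestureEvent("swipe_right", [0.15, 0.35, 0.85])))  # dismiss
```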
