Abstract

Touch-sensitive interfaces enable new interaction methods, such as gesture commands. For users to easily memorize more than a dozen gesture commands, gesture set customization must be supported. The classifier used to recognize drawn symbols must therefore be customizable – able to learn from very little data – and evolving – able to learn new classes on the fly and to improve during use. The objective of this work is a gesture command system that cooperates with the user as well as possible: one that learns from its mistakes without soliciting the user too often. This paper presents a novel approach for the online active learning of gesture commands, with three contributions. The IntuiSup supervisor monitors the learning process and user interactions. The Evolving Sampling by Uncertainty (ESU) algorithm maintains the error/interaction trade-off over time. The Boosted-ESU (B-ESU) method optimizes the impact of each interaction to speed up system learning. The efficiency of our approach is evaluated on the publicly available ILG database of gesture commands. Experiments show the effectiveness of the supervision strategy, with improvements in both accuracy and learning speed.
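The abstract does not give the details of the ESU algorithm, but the core idea of uncertainty-based sampling – querying the user only when the classifier's prediction is ambiguous – can be illustrated with a minimal sketch. The margin-based decision rule and the threshold below are generic assumptions, not the authors' actual method:

```python
def should_query_user(class_probs, margin_threshold=0.2):
    """Decide whether to ask the user to confirm a recognized gesture.

    Margin-based uncertainty sampling: if the gap between the two most
    probable classes is small, the prediction is ambiguous, so the user
    is solicited for the true label; otherwise the system proceeds
    silently, keeping interactions rare.
    """
    top2 = sorted(class_probs, reverse=True)[:2]
    margin = top2[0] - (top2[1] if len(top2) > 1 else 0.0)
    return margin < margin_threshold

# Ambiguous prediction (0.55 vs 0.40): query the user.
print(should_query_user([0.55, 0.40, 0.05]))  # True
# Confident prediction (0.90 vs 0.08): no interaction needed.
print(should_query_user([0.90, 0.08, 0.02]))  # False
```

Raising `margin_threshold` trades more user interactions for faster error correction, which is the error/interaction compromise the paper's supervisor aims to balance over time.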
