Abstract

We propose a novel approach for textual input based on air-writing recognition using smart-bands. The proposed approach enables the user to hand-write in the air in an intuitive and natural way, where text is recognized by analyzing the motion signals captured by an off-the-shelf smart-band worn by the user. Unlike existing studies that proposed the use of motion signals to recognize written letters, our approach does not require an extra dedicated device, nor does it impose unnecessary limitations on the user's writing process. To test the feasibility of the new approach, we developed two air-writing recognition methods: a user-dependent method, based on K-Nearest-Neighbors with Dynamic-Time-Warping as the distance measure, and a user-independent method, based on a Convolutional-Neural-Network. The former creates a tailored model for each user, using a set of reference samples collected from the user in an enrollment phase, and therefore has the potential to be more accurate. The latter involves a preliminary training phase which generates a single model to fit all users, and therefore does not require an enrollment phase for new users. In order to evaluate our methods, we collected 15 sets of the English alphabet letters (written in the air and captured using a smart-band) from 55 different subjects. The results of our evaluation demonstrate the ability of the proposed methods to successfully recognize air-written letters with a high degree of accuracy, obtaining 89.2% average accuracy for the user-dependent method, and 83.2% average accuracy (95.6% when applying an auto-correction phase) for the user-independent method.
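To make the user-dependent approach concrete, the sketch below illustrates the general idea of nearest-neighbor classification with a Dynamic-Time-Warping distance over motion-signal sequences. It is a minimal illustration only, not the authors' implementation: the signal dimensionality, the per-frame Euclidean distance, and the function names (dtw_distance, knn_dtw_predict) are assumptions for the sake of the example, and any preprocessing or segmentation of the smart-band signals is omitted.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-Time-Warping distance between two motion-signal
    sequences a and b, each of shape (time, channels)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # per-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def knn_dtw_predict(query, references, k=1):
    """Classify a query sequence by the majority label among the k
    reference samples (e.g. collected at enrollment) closest in DTW distance."""
    dists = sorted((dtw_distance(query, ref), label) for ref, label in references)
    top = [label for _, label in dists[:k]]
    return max(set(top), key=top.count)

# Toy usage with random signals standing in for enrolled letter samples.
rng = np.random.default_rng(0)
refs = [(rng.normal(size=(40, 6)), "a"), (rng.normal(size=(35, 6)), "b")]
query = rng.normal(size=(38, 6))
print(knn_dtw_predict(query, refs, k=1))
```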
