Abstract
This study presents and evaluates a multi-sensor hand pattern recognizer. We experimentally evaluated multiple features from the selected sensors, identified the most significant ones, and determined the best feature combination for optimal hand shape detection. Multilayer perceptron (MLP) and decision tree (DT) classifiers were used; employing two classification techniques ensured that the results did not depend on any single classifier. Hand keypoints were detected with vision-based sensors: Leap Motion (LM) controllers and a VIVE Pro Eye camera. These were integrated with a MYO armband, which streams eight electromyography (EMG) channels and inertial measurement unit (IMU) data. All sensor data were synchronized and streamed in real time for processing and fusion. We evaluated the integration on the classification of American Sign Language (ASL) hand shapes with seven participants and identified the best feature subset. The overall results were satisfactory: with the optimal subset, we reached an average classification accuracy of 98.31%.
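As a rough illustration of the two-classifier check described above, the sketch below trains an MLP and a decision tree on the same feature matrix and compares their accuracies. This is a hypothetical, minimal reconstruction using scikit-learn with synthetic data; the study's actual features come from fused Leap Motion / VIVE hand keypoints and MYO EMG + IMU streams, and its model parameters are not given here.

```python
# Hypothetical sketch: train MLP and Decision Tree classifiers on the same
# feature set and compare accuracies, mirroring the abstract's check that
# results do not depend on the classification technique.
# Synthetic stand-in data; the real features are fused multi-sensor features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Stand-in for the fused multi-sensor feature matrix (one row per sample,
# one label per hand shape class).
X, y = make_classification(n_samples=2000, n_features=40, n_informative=20,
                           n_classes=5, class_sep=2.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

results = {}
for name, clf in [
    ("MLP", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)),
    ("DT", DecisionTreeClassifier(random_state=0)),
]:
    clf.fit(X_tr, y_tr)
    results[name] = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: {results[name]:.3f}")
```

If both classifiers score similarly on the same feature subset, that supports the abstract's claim that the observed performance reflects the features rather than the classifier choice.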