Abstract
Hands-free control of assistive mobility devices has been developed to serve people with movement disabilities at all levels. In this study, we demonstrate a human–machine interface (HMI) system that uses piezoelectric sensors to translate face and tongue movements into control commands. This study addresses two issues. First, we used six piezoelectric sensors to acquire facial muscle signals and examined the sensor positions and signal features during winking and tongue movements. Second, we verified the proposed HMI through online control of a simulated wheelchair. Twelve volunteers participated in the experiment. A maximum classification accuracy of 98.0% was achieved using the maximum and mean parameters with the linear discriminant analysis (LDA) and K-nearest neighbors (KNN) classification algorithms. Using the proposed algorithm, command translation reached an average classification accuracy of more than 95% with a 0.5 s window for command creation. Online control of the simulated wheelchair showed high efficiency in terms of completion time: the combination of winking and tongue actions yielded steering times of the same order of magnitude as joystick-based control, less than twice the joystick time. Hence, the proposed system can be further implemented in a powered wheelchair for quadriplegic patients who retain control of their face or tongue muscles.
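As a minimal sketch of the feature-and-classifier pipeline summarized above (maximum and mean parameters per sensor channel over 0.5 s windows, classified with LDA and KNN), the example below illustrates one plausible implementation. The sampling rate, window segmentation, number of command classes, and all function and variable names are assumptions introduced for illustration; the study's actual signal conditioning and command translation logic are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

FS = 1000          # assumed sampling rate (Hz); not stated in the abstract
WINDOW_S = 0.5     # 0.5 s command window, as reported in the abstract
N_SENSORS = 6      # six piezoelectric sensor channels

def extract_features(window):
    """window: array of shape (samples, N_SENSORS).
    Returns the per-channel maximum and mean, the two parameters
    reported to give the highest classification accuracy."""
    return np.concatenate([window.max(axis=0), window.mean(axis=0)])

def windows_to_features(segments, labels):
    """segments: (n_windows, samples, N_SENSORS) segmented sensor data."""
    X = np.array([extract_features(w) for w in segments])
    return X, np.asarray(labels)

# Hypothetical usage with synthetic data standing in for recorded signals
rng = np.random.default_rng(0)
n_windows = 200
segments = rng.normal(size=(n_windows, int(FS * WINDOW_S), N_SENSORS))
labels = rng.integers(0, 4, size=n_windows)  # e.g. left wink, right wink, tongue, rest

X, y = windows_to_features(segments, labels)
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name} cross-validated accuracy: {acc:.2f}")
```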