Abstract

Human Machine Interface (HMI) applications based on Electrooculogram (EOG) signals, which convert user intention into control commands, hold promise for the development of prosthetic devices for persons with motor impairment. In the present work, an EOG-based control system has been investigated in an offline environment. The signals were acquired through g.LADYbird active electrodes placed at distinct positions on the face around the eyes. A classifier model was trained on a feature matrix of time-domain features extracted using the Dual Tree Complex Wavelet Transform (DTCWT). A linear Support Vector Machine (SVM) classifier was trained on 240 data sets recorded from 12 healthy subjects. MATLAB simulation showed 99.2% classification accuracy for horizontal eye movement in two directions, left and right. The classified signals were converted into commands through an Arduino to grasp and release an object with a prosthetic myoelectric hand.
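The pipeline described above (time-domain features → linear SVM → single-byte command to the Arduino) can be sketched in miniature as follows. This is a hypothetical illustration, not the authors' MATLAB implementation: the EOG traces are synthetic, the two features (mean and signed peak) stand in for the DTCWT-derived features, and the linear SVM is trained from scratch with subgradient descent on the hinge loss.

```python
import random

random.seed(0)

def synth_trial(direction):
    """Toy horizontal EOG trace: positive deflection for a rightward
    saccade, negative for leftward, plus Gaussian noise (assumption)."""
    sign = 1.0 if direction == "right" else -1.0
    return [sign * 50.0 + random.gauss(0, 5) for _ in range(64)]

def features(trace):
    """Two simple time-domain features: mean amplitude and signed peak.
    (The paper uses DTCWT subband features instead.)"""
    mean = sum(trace) / len(trace)
    peak = max(trace, key=abs)
    return [mean, peak]

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.001):
    """Minimal linear SVM via subgradient descent on the hinge loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # inside margin: hinge-loss gradient step
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # correctly classified: regularization only
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    score = sum(wj * xj for wj, xj in zip(w, x)) + b
    return "right" if score >= 0 else "left"

# Train on synthetic trials (labels: right = +1, left = -1).
trials = [("right", +1)] * 10 + [("left", -1)] * 10
X = [features(synth_trial(d)) for d, _ in trials]
y = [lab for _, lab in trials]
w, b = train_linear_svm(X, y)

# Map the classified direction to a one-byte command, as might be sent
# over a serial link to the Arduino controlling the prosthetic hand.
cmd = {"right": b"R", "left": b"L"}
print(cmd[predict(w, b, features(synth_trial("left")))])  # → b'L'
```

The byte written at the end is where a real system would issue the grasp/release command over the serial port; everything upstream of that line is an offline stand-in for the acquisition and classification stages.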
