Abstract

One of the most promising ways to assist amputee or paralyzed patients in controlling prosthetic devices is a brain-computer interface (BCI). A BCI enables communication between the brain and the prosthetic device through signal-processing protocols. However, because brain signals are noisy, available signal-processing protocols cannot reliably interpret brain commands and cannot be used beyond the laboratory setting. To address this challenge, in this work we present a novel automatic brain-signal recognition protocol based on vowel articulation mode. The approach identifies the mental state of imagining open-mid and closed vowels, without imagining any movement of the oral cavity, for application in prosthetic device control. The method uses brain signals recorded over the language area (21 electrodes) while the subject performs the specific task of thinking the corresponding vowel. In the processing stage, the power spectral density (PSD) was computed for each brain signal, and classification was carried out with a Support Vector Machine (SVM). Recognition precision for vowels grouped by articulation mode ranged between 84% and 94%. The proposed method is promising for use by amputee or paraplegic patients.
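The pipeline described in the abstract (per-channel PSD features followed by SVM classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, band limits, Welch parameters, and SVM settings are assumptions, and the EEG trials here are synthetic stand-ins for the 21-electrode recordings.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
fs = 256                                   # assumed sampling rate (Hz)
n_trials, n_channels, n_samples = 120, 21, fs * 2

# Synthetic stand-in for EEG trials from the 21 language-area electrodes:
# class 0 ("open-mid vowel") gets extra 10 Hz power, class 1 ("closed vowel") 20 Hz.
t = np.arange(n_samples) / fs
labels = rng.integers(0, 2, n_trials)
trials = rng.normal(0.0, 1.0, (n_trials, n_channels, n_samples))
for i, y in enumerate(labels):
    f0 = 10 if y == 0 else 20
    trials[i] += 0.8 * np.sin(2 * np.pi * f0 * t)

# Feature extraction: Welch PSD per channel, concatenated into one vector per trial.
freqs, psd = welch(trials, fs=fs, nperseg=fs, axis=-1)
band = (freqs >= 4) & (freqs <= 40)        # keep a broad EEG band (assumed limits)
X = np.log(psd[:, :, band]).reshape(n_trials, -1)

# Classification with an SVM, as in the abstract (RBF kernel is an assumption).
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, random_state=0, stratify=labels)
clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

On this toy data the two classes separate cleanly; real vowel-imagery EEG would of course be far noisier, which is the challenge the abstract highlights.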
