Abstract

Noninvasive brain-machine interface (BMI) systems based on electroencephalography (EEG) and driven by spontaneous movement intentions are useful tools for controlling external devices or supporting neurorehabilitation. In this study, we demonstrate the feasibility of a brain-controlled robot arm system based on arm trajectory decoding. We first constructed an experimental system that acquires EEG data for both movement execution (ME) and movement imagery (MI) tasks. Five subjects participated in our experiments and performed reaching tasks in four directions (left, right, forward, and backward) in 3D space. For robust arm trajectory decoding, we propose a subject-dependent deep neural network (DNN) architecture based on a bidirectional long short-term memory (LSTM) network. The decoder achieved high decoding performance (r-value > 0.8) on the X-, Y-, and Z-axes across all subjects for the MI as well as the ME tasks. These results show the feasibility of an EEG-based intuitive robot arm control system for high-level tasks (e.g., drinking water or moving objects). We also confirmed that, in offline analysis, the proposed method shows little variation in decoding performance between the ME and MI tasks. Hence, we expect the decoding model to be capable of robust trajectory decoding even in a real-time environment.
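The abstract describes a bidirectional LSTM that maps EEG time series to continuous 3D hand positions. As the paper's exact layer sizes, channel count, and preprocessing are not given here, the following is only a minimal PyTorch sketch of such a decoder; the channel count (64), hidden size (128), and window length (200 samples) are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class TrajectoryDecoder(nn.Module):
    """Bidirectional LSTM mapping a window of multi-channel EEG
    samples to a 3D (X, Y, Z) position estimate at every time step.
    All dimensions below are assumed for illustration."""

    def __init__(self, n_channels=64, hidden=128, n_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, n_layers,
                            batch_first=True, bidirectional=True)
        # Bidirectional output concatenates forward and backward states.
        self.head = nn.Linear(2 * hidden, 3)  # X, Y, Z coordinates

    def forward(self, x):            # x: (batch, time, channels)
        out, _ = self.lstm(x)        # out: (batch, time, 2 * hidden)
        return self.head(out)        # (batch, time, 3)

# Smoke test on random data shaped like a short EEG window.
model = TrajectoryDecoder()
eeg = torch.randn(4, 200, 64)        # batch=4, 200 samples, 64 channels
traj = model(eeg)
print(tuple(traj.shape))             # (4, 200, 3)
```

In practice such a regressor would be trained per subject (matching the subject-dependent design in the abstract) with a mean-squared-error loss against the recorded arm trajectory, and evaluated with the Pearson correlation (r-value) per axis, as reported above.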
