Abstract

To strengthen the participation of stroke patients in rehabilitation training and reduce the dependence of steady-state visually evoked potential (SSVEP)-based brain-computer interfaces (BCI) on external stimulus equipment, we build an augmented-reality-based brain-computer interface (AR-BCI) system applied to a rehabilitation exoskeleton. The system uses a HoloLens to present a 4-class BCI and adopts a sequential logic decoding method to control sixteen rehabilitation movements. In offline experiments, the performance of AR-BCI is compared with that of a computer screen-based brain-computer interface (CS-BCI), and the effects of data length and of the number and position of electrodes on BCI performance are studied. Then, the instruction classification accuracy of AR-BCI and the movement accuracy of the exoskeleton are evaluated in online rehabilitation training. The average recognition accuracy of AR-BCI is 90.2% in the offline experiments, only slightly lower than that of CS-BCI. The recognition accuracy remains above 90% even when only the Oz and O2 electrodes are used. The online results show that the instruction classification accuracy of AR-BCI is 88.9% and the average information transfer rate (ITR) is 30.01 bits min−1 with a data length of 2.5 s. The movement accuracy of the exoskeleton is 91.12% with an ITR of 31.63 bits min−1, which is 2.2% higher than the instruction classification accuracy of AR-BCI. These results show that AR-BCI provides a high-performance and more user-friendly human–computer interaction method, and greatly improves the application potential of wearable BCI.
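The ITR values quoted above follow the standard Wolpaw formula for a BCI with N equiprobable classes. As a rough check, the sketch below computes it for the 4-class setup; it assumes the trial time equals the 2.5 s data length, whereas the paper's slightly lower reported values presumably also count gaze-shifting intervals between selections.

```python
import math

def wolpaw_itr(n_classes, accuracy, trial_seconds):
    """Wolpaw information transfer rate in bits/min.

    n_classes: number of BCI targets (4 in this system)
    accuracy: classification accuracy P, with 1/n_classes < P < 1
    trial_seconds: time per selection (here assumed equal to the
        data length; the paper likely adds a gaze-shift gap)
    """
    p, n = accuracy, n_classes
    bits_per_trial = (math.log2(n)
                      + p * math.log2(p)
                      + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits_per_trial * 60.0 / trial_seconds

# 4 classes, 88.9% accuracy, 2.5 s per trial -> about 31.7 bits/min,
# a bit above the reported 30.01 bits/min (longer effective trial time).
print(round(wolpaw_itr(4, 0.889, 2.5), 2))
```

Because the formula is monotonically increasing in accuracy for a fixed trial time, the 91.12% movement accuracy naturally yields a higher ITR than the 88.9% instruction accuracy.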
