Abstract

Aim of the Study
Brain-computer interfaces (BCIs) may help patients with severe neurological deficits communicate with the external world. Based on microelectrocorticography (µECoG) data recorded from the primary somatosensory cortex (S1) of unrestrained, behaving rats, this study attempts to decode lever presses in a psychophysical detection task using machine learning algorithms.

Materials and Methods
16-channel Pt-Ir microelectrode arrays were implanted on the S1 of two rats, and µECoG was recorded during a vibrotactile yes/no detection task. The rats were trained to press the right lever when they detected the vibrotactile stimulus and the left lever when they did not. The multichannel µECoG data were analysed offline with time-frequency methods, and the resulting features were used for binary classification of the lever press on each trial. Several machine learning algorithms were tested for this purpose.

Results
The psychophysical sensitivities (A') were similarly low for both rats (0.58). Rat 2 (B'': −0.11) showed a stronger bias towards the right lever than Rat 1 (B'': −0.01). The lever presses could be predicted with accuracies above 66% with all tested algorithms, and the highest average accuracy (78%) was obtained with the support vector machine.

Conclusion
According to recent studies, sensory feedback increases the benefit of BCIs. The current proof-of-concept study shows that lever presses can be decoded from S1; therefore, this area may be utilised for a bidirectional BCI in the future.
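The abstract does not state how A' and B'' were computed; the reported values are consistent with the standard non-parametric signal-detection indices (commonly attributed to Grier, 1971), which, for hit rate H and false-alarm rate F with H ≥ F, are usually written as

A' = \frac{1}{2} + \frac{(H - F)(1 + H - F)}{4H(1 - F)}, \qquad B'' = \frac{H(1 - H) - F(1 - F)}{H(1 - H) + F(1 - F)}

where a negative B'' indicates a bias towards "yes" (here, right-lever) responses. This interpretation is an assumption, not a definition given in the abstract.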
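As a rough illustration of the decoding pipeline summarised above, the Python sketch below extracts band-power (time-frequency) features from multichannel epochs and classifies the lever press with a support vector machine. The sampling rate, epoch length, frequency bands, synthetic data, and the scipy/scikit-learn functions used are illustrative assumptions; none of them are taken from the paper.

# Minimal sketch of single-trial lever-press decoding from multichannel µECoG,
# assuming band-power features and an SVM classifier (hypothetical parameters).
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 1000                                      # assumed sampling rate (Hz)
n_trials, n_channels, n_samples = 200, 16, fs  # 1-s epochs, 16-channel array
bands = [(4, 8), (8, 12), (12, 30), (30, 80)]  # assumed frequency bands (Hz)

# Placeholder data: (trials x channels x samples) epochs and lever labels
# (0 = left/"no", 1 = right/"yes"); real µECoG epochs would go here.
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)

def band_power_features(epochs, fs, bands):
    """Mean spectral power per channel and band, flattened per trial."""
    feats = []
    for trial in epochs:
        f, pxx = welch(trial, fs=fs, nperseg=256, axis=-1)
        trial_feats = [pxx[:, (f >= lo) & (f < hi)].mean(axis=-1) for lo, hi in bands]
        feats.append(np.concatenate(trial_feats))
    return np.log(np.asarray(feats))  # log power stabilises the feature scale

X = band_power_features(X_raw, fs, bands)

# SVM on standardised features; accuracy estimated by 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f}")

With real recordings, the placeholder array would be replaced by stimulus-aligned µECoG epochs, and accuracy would be estimated on held-out trials, analogous to the per-trial classification reported in the study.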
