Abstract

Brain-computer interfaces (BCIs) provide an alternative way for locked-in patients to interact with their environment based solely on neural activity. The drawback of independent BCIs is their limited number of commands, usually only two. This poses great challenges to a BCI's usefulness and applicability in real-life scenarios. A potential avenue for increasing the number of independent BCI commands is modulating brain activity with music, in regions such as the orbitofrontal cortex. By quantifying oscillatory signatures such as the alpha rhythm over the frontal cortex, we can gain a greater understanding of the effect of music at the cortical level. Just as desynchronization patterns during motor imagery are comparable to those during real movement, the imagination of music elicits responses from the auditory cortex as if the sound were actually heard. We propose an experimental paradigm to train subjects to elicit discriminative brain activation patterns when imagining high and low music scales. Each trial of listening to music (high or low scale) was followed by its respective music imagery. This three-subject experiment achieved over 70% accuracy in independent BCI performance and showed similar distributions of Discriminative Brain Patterns (DBPs) between high and low music listening and imagination, as shown in Fig. 1. This pilot study opens an avenue for increasing BCI commands, especially in independent BCIs, which are currently very limited. It also provides a potential channel for music composition.

Fig. 1. Sound stimuli of the high and low scales composed with the online NoteFlight software: (1) the high music scale and (2) the low music scale.
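
As a rough illustration of the kind of analysis described above, the sketch below computes frontal alpha-band power features from epoched EEG and cross-validates a binary high-vs.-low-scale decoder. This is not the authors' pipeline: the sampling rate, epoch length, channel count, and the choice of Welch PSD features with an LDA classifier are all assumptions, and the data are synthetic placeholders.

```python
# Minimal sketch (assumed pipeline, not the paper's): frontal alpha-band power
# features from epoched EEG, classified as high- vs. low-scale music imagery.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                            # sampling rate in Hz (assumed)
n_trials, n_channels, n_samples = 60, 8, fs * 4     # 4-s imagery epochs (assumed)

# Placeholder data standing in for frontal-channel EEG epochs and labels
# (0 = low-scale imagery, 1 = high-scale imagery).
epochs = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, size=n_trials)

def alpha_band_power(epoch, fs, band=(8.0, 13.0)):
    """Log mean power in the alpha band for each channel of one epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)    # PSD per channel
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[:, mask].mean(axis=1))

# One alpha-power feature per channel, per trial.
features = np.array([alpha_band_power(ep, fs) for ep in epochs])

# Linear discriminant analysis with 5-fold cross-validation as a simple
# stand-in for the binary high/low-scale decoder.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

On real recordings, the synthetic epochs would be replaced by band-pass-filtered frontal-channel data time-locked to the imagery cue; the feature and classification steps stay the same.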
