Abstract

Although much research on Music Information Retrieval (MIR) has been done in the last decade, current MIR systems still specify a user query for finding a similar piece of music either through conventional keywords or through music content. We aim to realize a new type of MIR equipped with a brain-computer interface based on electroencephalogram (EEG) signals. Toward this goal, this paper proposes an architecture for MIR driven by EEG signals. While the architecture leaves many issues to be solved, its key idea is to construct the user's music query through multi-layered aggregation of EEG signals. We describe preliminary experiments conducted to select appropriate low-level features for this multi-layered query construction and matching. The results show that the mental states of users listening to music can be classified with high accuracy using aggregated EEG features. Based on these results, we are beginning the detailed design of the architecture.
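To make the aggregation-then-classification idea concrete, the sketch below shows one plausible instance of the approach rather than the authors' actual pipeline: per-channel spectral band powers (an assumed choice of low-level feature) are averaged over windows of a listening trial and fed to an off-the-shelf classifier. The sampling rate, frequency bands, classifier, and synthetic trial data are all illustrative assumptions.

```python
# Minimal sketch of EEG feature aggregation for mental-state classification.
# NOT the paper's implementation: bands, sampling rate, and data are assumed.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FS = 256  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

def band_power_features(window: np.ndarray) -> np.ndarray:
    """window: (n_channels, n_samples) EEG segment -> flat feature vector."""
    freqs, psd = welch(window, fs=FS, nperseg=min(FS, window.shape[-1]))
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))  # mean band power per channel
    return np.concatenate(feats)

def aggregate(trial: np.ndarray, win: int = 2 * FS) -> np.ndarray:
    """Average window-level features over a whole listening trial."""
    starts = range(0, trial.shape[1] - win + 1, win)
    return np.mean([band_power_features(trial[:, s:s + win]) for s in starts],
                   axis=0)

# Synthetic stand-in data: 40 trials, 8 channels, 10 s each, binary labels.
rng = np.random.default_rng(0)
X = np.array([aggregate(rng.standard_normal((8, 10 * FS))) for _ in range(40)])
y = rng.integers(0, 2, size=40)

clf = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

A real system would replace the synthetic trials with recorded EEG and tune the bands and classifier; the point here is only the shape of the aggregation-then-classification pipeline that the abstract describes.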
