Abstract

People listen to music to relax from the stresses of daily life, so retrieving preferred music from large music collections has been an active research topic for many years. Traditionally, music retrieval falls into two main types: text-based retrieval and content-based retrieval. However, both types ignore an important human factor: emotion. That is, the music a listener prefers may differ across emotional states. Emotion is strongly related to the listener's environment and can be reflected in brain activity. Therefore, in this paper, we propose a novel approach that performs ubiquitous music search through content comparisons of brain signals and music. The main goal of this paper is to provide affective music retrieval in different contexts. Without any explicit query, the context-dependent brain state triggers the music search, and context-related music is retrieved by computing brain-signal similarities and music similarities. The proposed approach was implemented and evaluated with a number of volunteers, and the evaluation results show that the proposed affective music retrieval achieves high satisfaction among the invited test users.
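To make the query-free retrieval idea concrete, the following is a minimal sketch of one way such similarity-driven retrieval could be organized: match the current brain state against previously recorded brain states, then rank tracks by similarity to the music associated with the best match. The feature representations, helper names, and two-stage ranking here are illustrative assumptions, not the paper's actual method.

```python
# A minimal sketch of query-free, similarity-based music retrieval, assuming
# brain signals and music tracks have already been reduced to fixed-length
# feature vectors. All names and the two-stage ranking are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def retrieve(current_brain: np.ndarray,
             history: list[tuple[np.ndarray, np.ndarray]],
             library: list[tuple[str, np.ndarray]],
             top_k: int = 5) -> list[str]:
    """Rank library tracks without an explicit text or audio query.

    history: (brain_features, music_features) pairs from past contexts.
    library: (track_id, music_features) pairs available for retrieval.
    """
    # 1. Brain similarity: find the past brain state closest to the current one.
    _, matched_music = max(
        history, key=lambda pair: cosine_similarity(current_brain, pair[0]))
    # 2. Music similarity: rank tracks against the music heard in that state.
    ranked = sorted(library,
                    key=lambda item: cosine_similarity(matched_music, item[1]),
                    reverse=True)
    return [track_id for track_id, _ in ranked[:top_k]]
```

Under these assumptions, the listener never issues a query: the current brain-feature vector alone drives both the brain-similarity match and the subsequent music-similarity ranking.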
