Abstract

With the rapid development of multimedia technology, digital music has become increasingly available and now constitutes a significant component of multimedia content on the Internet. Since digital music can be represented in various forms, formats, and dimensions, searching it is far more challenging than text-based search. While some basic forms of music retrieval are available on the Internet, they tend to be inflexible and have significant limitations. Currently, most of these music retrieval systems rely only on shallow music information (e.g., metadata, album titles, lyrics). Here, we present an approach to deep content-based music information retrieval that focuses on high-level human perception, incorporating subtle nuances and emotional impressions of the music (e.g., style, tempo, genre, mood, instrumental combinations). We also provide a critical evaluation of the most common current Music Information Retrieval (MIR) approaches and propose an innovative adaptive method for music information search that overcomes their limitations. Our approach centers on music discovery and recovery through collaborative semantic indexing and user relevance feedback analysis. Through successive use of our indexing model, a novel music content index can be built incrementally and collectively from deep user knowledge by accumulating users' judgments and intelligence.
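The abstract does not specify a concrete implementation, but the core idea of building an index incrementally from accumulated relevance judgments can be illustrated with a minimal sketch. The sketch below is an illustrative assumption only: the names (SemanticIndex, record_feedback, search) and the simple additive weighting scheme are hypothetical, not the authors' actual method.

from collections import defaultdict


class SemanticIndex:
    """Maps semantic descriptors (e.g., mood, genre, tempo class) to tracks,
    with weights accumulated from users' relevance feedback."""

    def __init__(self, learning_rate: float = 0.1):
        self.learning_rate = learning_rate
        # weights[descriptor][track_id] -> accumulated relevance weight
        self.weights: dict[str, dict[str, float]] = defaultdict(
            lambda: defaultdict(float)
        )

    def record_feedback(
        self, query_descriptors: list[str], track_id: str, relevant: bool
    ) -> None:
        """Strengthen or weaken descriptor->track associations from one
        user's judgment; repeated use accumulates collective knowledge."""
        delta = self.learning_rate if relevant else -self.learning_rate
        for descriptor in query_descriptors:
            self.weights[descriptor][track_id] += delta

    def search(
        self, query_descriptors: list[str], top_k: int = 5
    ) -> list[tuple[str, float]]:
        """Score each track by summing its weights over the query descriptors."""
        scores: dict[str, float] = defaultdict(float)
        for descriptor in query_descriptors:
            for track_id, weight in self.weights[descriptor].items():
                scores[track_id] += weight
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:top_k]


if __name__ == "__main__":
    index = SemanticIndex()
    # Two hypothetical users judge results for a "melancholic piano" query;
    # the index incorporates both judgments incrementally.
    index.record_feedback(["melancholic", "piano"], "track_A", relevant=True)
    index.record_feedback(["melancholic", "piano"], "track_B", relevant=False)
    index.record_feedback(["melancholic", "piano"], "track_A", relevant=True)
    print(index.search(["melancholic", "piano"]))

In this toy form, each positive judgment nudges the descriptor-to-track weights upward and each negative judgment nudges them downward, so the index reflects the aggregate of all users' feedback rather than any single annotation.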
