Abstract

In this paper, we present a comparative study of three classification paradigms for genre classification based on repetitive basslines. Despite the large variety of instrumentation across music genres, a bass instrument can be found in most of them; the bass track can therefore be analysed to explore stylistic similarities between genres. We present an extensive set of transcription-based high-level features related to rhythm and tonality that characterizes basslines on a symbolic level. Traditional classification based on pattern recognition techniques and audio features is compared with rule-based classification and classification based on the similarity between basslines. For evaluation, we use a novel dataset consisting of typical basslines from 13 music genres with different cultural backgrounds. Finally, the genre confusions observed in the experiments are examined by musicologists. Our study shows that several known stylistic relationships between music genres can be verified in this way by classifying typical basslines. We achieved a best accuracy of 64.8% for genre classification based solely on the repetitive basslines of a song.
