Abstract

Rhythmic information plays an important role in Music Information Retrieval. Example applications include automatically annotating large databases by genre, meter, ballroom dance style, or tempo, fully automated DJ-ing, and audio segmentation for further retrieval tasks such as automatic chord labeling. In this article, we therefore provide an introductory overview of basic and current principles of tempo detection. We then show how these can be improved by incorporating ballroom dance style recognition. We introduce a set of 82 rhythmic features for rhythm analysis on real audio. With this set, the meter and ballroom dance style are first identified in a data-driven manner using support vector machines. This information is then used to detect tempo more robustly. We evaluate the suggested method on a large public database containing 1,800 titles of standard and Latin ballroom dance music. Extensive test runs show a clear boost in performance.
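
As a rough, hedged illustration of the second step described above (not the authors' implementation), the sketch below shows how a predicted ballroom dance style can be used to resolve ambiguity among tempo candidates from a periodicity analysis, which are typically related by octave errors. The style-specific tempo ranges, function names, and parameters are assumptions made for this example only.

    # Minimal sketch: pick a quarter-note tempo with help of the predicted
    # ballroom dance style. Tempo ranges below are illustrative, not the
    # values used in the paper.
    import numpy as np

    STYLE_TEMPO_RANGES = {            # hypothetical BPM ranges per style
        "ChaChaCha": (116, 132), "Jive": (160, 184),
        "Quickstep": (196, 216), "Rumba": (96, 108),
        "Samba": (96, 104), "Tango": (120, 136),
        "VienneseWaltz": (168, 186), "Waltz": (82, 94),
    }

    def tempo_candidates(onset_envelope, frame_rate, bpm_min=60, bpm_max=240):
        """Rank candidate tempi (BPM) by autocorrelation strength."""
        env = onset_envelope - onset_envelope.mean()
        acf = np.correlate(env, env, mode="full")[len(env) - 1:]
        lags = np.arange(1, len(acf))            # lag 0 excluded
        bpm = 60.0 * frame_rate / lags
        mask = (bpm >= bpm_min) & (bpm <= bpm_max)
        order = np.argsort(acf[1:][mask])[::-1]  # strongest first
        return bpm[mask][order]

    def tempo_from_style(candidates, style):
        """Prefer the strongest candidate inside the style's tempo range."""
        lo, hi = STYLE_TEMPO_RANGES.get(style, (0, float("inf")))
        for bpm in candidates:
            if lo <= bpm <= hi:
                return bpm
        return candidates[0]                     # fall back to strongest

Note that the paper's outline names comb filter tempo analysis as the periodicity stage; the plain autocorrelation above is a simplification, while the constrain-by-style idea is the same.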

Highlights

  • Music Information Retrieval (MIR) has been a growing field of research over the last decade

  • A data-driven rhythm analysis approach is introduced that extracts rhythmic features and robustly identifies duple and triple meter, quarter-note tempo, and ballroom dance style based on 82 rhythmic features, which are described (a brief classification sketch follows this list)

  • Preliminary test runs for discrimination between 6 genres (Documentary, Chill, Classic, Jazz, Pop-Rock, and Electronic) on the same dataset, and with the same test conditions as used in [31], indicate accuracies of up to 70% using only the 82 rhythmic features
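
As a rough sketch of the data-driven classification step named in the highlights, the example below trains a support vector machine on 82-dimensional rhythmic feature vectors. The feature matrix and labels are random placeholders, and the RBF kernel, feature scaling, and 10-fold cross-validation are assumptions for illustration rather than the configuration reported in the paper.

    # Hedged sketch: SVM classification of ballroom dance style (or genre)
    # from the 82 rhythmic features. X and y are stand-ins; real feature
    # extraction is assumed to have happened beforehand.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1800, 82))      # placeholder rhythmic feature matrix
    y = rng.integers(0, 8, size=1800)    # placeholder labels (8 dance styles)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"mean accuracy: {scores.mean():.3f}")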


Summary

INTRODUCTION

Music Information Retrieval (MIR) has been a growing field of research over the last decade. Yet very little work combines low-level detection methods, such as tempo induction, meter recognition, and beat tracking, into a system that uses features from all of these subtasks for robust high-level classification tasks, for example ballroom dance style or genre recognition, and in turn uses the classification results to improve the low-level detection. Few works, such as [11, 12], present data-driven genre and meter recognition.

RELATED WORK
RHYTHM ANALYSIS
Comb filter tempo analysis
Feature extraction
Preprocessing
Tatum features
Meter features
Feature selection
Song database
Data-driven meter and ballroom dance style recognition
From ballroom dance style to tempo
RESULTS
CONCLUSION AND OUTLOOK