Abstract
Mood is an important aspect of music, and knowledge of mood can be used as a basic feature in music recommender and retrieval systems. A listening experiment was carried out to establish ratings for various moods and a number of attributes, e.g., valence and arousal. The analysis of these data covers the number of basic dimensions in music mood, their relation to valence and arousal, the distribution of moods in the valence–arousal plane, the distinctiveness of the mood labels, and the appropriate (number of) labels for full coverage of the plane. It is also shown that subject-averaged valence and arousal ratings can be predicted from music features by a linear model.
Highlights
Music recommendation and retrieval is of interest due to the increasing amount of audio data available to the average consumer.
Validation of the axis interpretation: on the basis of the eigenvectors, we argued that the two main dimensions of music mood are associated with valence and arousal.
The average error agrees well with the value expected from the measurement noise, since it lies within the 95% confidence interval. These results show that subject-averaged valence and arousal ratings can be adequately predicted using features automatically extracted from the music.
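The prediction scheme in this highlight can be illustrated with a small sketch. The feature matrix, the mapping to (valence, arousal), and the noise level below are all synthetic stand-ins, not the paper's data; the point is only the shape of the method: fit a linear model by ordinary least squares and check the residual error against the noise floor.

```python
import numpy as np

# Hypothetical illustration: predict subject-averaged valence and arousal
# ratings from automatically extracted music features with a linear model.
# All numbers here are synthetic; only the fitting procedure is the point.
rng = np.random.default_rng(0)
n_songs, n_features = 50, 5

X = rng.normal(size=(n_songs, n_features))       # extracted music features
true_W = rng.normal(size=(n_features, 2))        # unknown map to (valence, arousal)
Y = X @ true_W + 0.1 * rng.normal(size=(n_songs, 2))  # noisy averaged ratings

# Fit a linear model with an intercept term via ordinary least squares.
X1 = np.hstack([X, np.ones((n_songs, 1))])
W, *_ = np.linalg.lstsq(X1, Y, rcond=None)
Y_hat = X1 @ W

# Root-mean-square prediction error per dimension (valence, arousal).
rmse = np.sqrt(np.mean((Y - Y_hat) ** 2, axis=0))
print(rmse)  # close to the 0.1 noise level when the linear model is adequate
```

With real data one would of course cross-validate rather than report the training residual, but the comparison of residual error against measurement noise mirrors the confidence-interval check described above.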
Summary
Music recommendation and retrieval is of interest due to the increasing amount of audio data available to the average consumer. Experimental data on the similarity in mood of different songs can be instrumental in defining musical distance measures [1,2] and would enable the definition of prototypical songs (or song features) for various moods. The latter can be used as so-called mood presets in music recommendation systems. With this in mind, we defined an experiment to collect the relevant data. The data collected in earlier studies on music mood [3,4,5,6,7,8,9,10,11,12] only partially meet these requirements.
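The idea of mood presets built on prototypical points can be sketched as follows. The preset coordinates, song ratings, and the use of plain Euclidean distance in the valence–arousal plane are illustrative assumptions, not the distance measures of [1,2]:

```python
import math

# Hypothetical mood presets as prototypical (valence, arousal) points,
# and songs with (valence, arousal) ratings. All coordinates are invented.
mood_presets = {
    "happy": (0.8, 0.6),
    "sad": (-0.6, -0.4),
    "calm": (0.3, -0.7),
    "angry": (-0.5, 0.8),
}

songs = {
    "song_a": (0.7, 0.5),
    "song_b": (-0.4, -0.5),
    "song_c": (0.2, -0.6),
}

def closest_songs(preset, k=2):
    """Return the k songs nearest to a mood preset in the valence-arousal plane."""
    target = mood_presets[preset]
    ranked = sorted(songs, key=lambda s: math.dist(songs[s], target))
    return ranked[:k]

print(closest_songs("happy"))  # → ['song_a', 'song_c']
```

A recommender with such presets would rank the library by this distance and play from the nearest prototypes; richer distance measures would replace `math.dist` without changing the overall structure.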