Abstract

The aim of this work was to find performances with a similar emotional distribution within the same composition. A comparative analysis of musical performances based on emotion tracking is proposed. The analysis uses a dimensional approach to dynamic music emotion recognition: music data were annotated, regressors were trained, and the arousal and valence values predicted by the regressors were used to compare performances. The results confirm the assumption that tracking and analyzing arousal and valence over time in different performances of the same composition can be used to indicate their similarities. Detailed results are presented for different performances of Prelude No. 1 by Frédéric Chopin; they made it possible, for example, to find the performances most similar to that of Arthur Rubinstein. The analysis shows which performances of the same composition are close to each other and which are quite distant in terms of how arousal and valence are shaped over time. The presented method gives access to knowledge about a performer's shaping of emotions that was previously available only to music professionals.
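
The abstract does not state which similarity measure is used to compare performances, so the sketch below is only an illustration of the general idea: each performance is represented as a sequence of (arousal, valence) values predicted per segment, and two such trajectories for the same composition are compared with a mean Euclidean distance. The function name, the toy data, and the choice of distance are assumptions for illustration, not the paper's method.

```python
import numpy as np

def trajectory_distance(av_a, av_b):
    """Mean Euclidean distance between two arousal-valence trajectories.

    av_a, av_b: sequences of (arousal, valence) pairs, one pair per
    segment of the same composition, so both have shape (T, 2).
    A smaller value means the two performances shape emotion more similarly.
    """
    av_a = np.asarray(av_a, dtype=float)
    av_b = np.asarray(av_b, dtype=float)
    if av_a.shape != av_b.shape:
        raise ValueError("trajectories must cover the same segments")
    return float(np.mean(np.linalg.norm(av_a - av_b, axis=1)))

# Toy example (hypothetical values): three performances, four segments each.
rubinstein = [(0.20, 0.50), (0.30, 0.40), (0.10, 0.60), (0.00, 0.50)]
perf_x     = [(0.25, 0.45), (0.35, 0.40), (0.15, 0.55), (0.05, 0.50)]
perf_y     = [(-0.30, 0.10), (-0.20, 0.00), (-0.40, 0.20), (-0.50, 0.10)]

print(trajectory_distance(rubinstein, perf_x))  # small distance: similar shaping
print(trajectory_distance(rubinstein, perf_y))  # large distance: distant shaping
```

In this reading, ranking all performances by their distance to a reference recording (here, hypothetically, Rubinstein's) would identify the most similar interpretations; other measures such as correlation or dynamic time warping could equally serve as the comparison step.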
