Abstract

Music software applications often require similarity-finding measures. In this study, we describe an empirically derived measure for determining similarity between two melodies with multiple-note changes. The derivation of our final model involved three stages. In Stage 1, eight standard melodies were systematically varied with respect to pitch distance, pitch direction, tonal stability, metric salience and melodic contour. Comparison melodies with a one-note change were presented in transposed and nontransposed conditions. For the nontransposed condition, predictors of explained variance in similarity ratings were pitch distance, pitch direction and melodic contour. For the transposed condition, predictors were tonal stability and melodic contour. In Stage 2, we added the effects of primacy and recency. In Stage 3, comparison melodies with two-note changes were introduced, which allowed us to derive a more generalizable model capable of accommodating multiple-note changes. In a follow-up experiment, we show that our empirically derived measure of melodic similarity outperformed the Mongeau and Sankoff similarity measure. An empirically derived measure, such as the one described here, has the potential to extend the domain of similarity-finding methods in music information retrieval, on the basis of psychological predictors.
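
The abstract does not give the model's coefficients or functional form, but a measure built from regression on similarity ratings is typically a weighted combination of predictor values. The sketch below is purely illustrative: the predictor names follow the abstract, while the weights and the linear form are assumptions, not the coefficients reported in the study.

```python
# Illustrative sketch only: a hypothetical linear-combination similarity score
# built from the predictors named in the abstract. The weights below are
# placeholders, not the values estimated in the study.

def melodic_similarity(features, weights):
    """Combine predictor values for a comparison melody into one score.

    features: dict mapping predictor name -> numeric value for the changed note(s)
    weights:  dict mapping predictor name -> regression weight (hypothetical here)
    """
    return sum(weights[name] * value for name, value in features.items())


# Example: nontransposed condition, where pitch distance, pitch direction and
# melodic contour were the reported predictors (weight values are made up).
features = {"pitch_distance": 2.0, "pitch_direction": 1.0, "contour_change": 0.0}
weights = {"pitch_distance": -0.4, "pitch_direction": -0.2, "contour_change": -0.8}

print(melodic_similarity(features, weights))
```

A measure of this shape drops straight into ranking tasks: comparison melodies can be sorted by the score, which is how a similarity measure would be compared against an edit-distance-based approach such as Mongeau and Sankoff's.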
