Abstract

The algorithm presented in this paper provides the means for the real-time recognition of the key signature associated with a given piece of music, based on the analysis of a very small number of initial notes. The algorithm can easily be implemented in electronic musical instruments, enabling real-time generation of musical notation. The essence of the proposed solution is the analysis of a music signature, defined as a set of twelve vectors representing the particular pitch classes. These vectors are anchored in the center of the circle of fifths, pointing radially towards each of the twelve tones of the chromatic scale. Besides a thorough description of the algorithm, the authors also present a theoretical introduction to the subject matter. The results of experiments performed on preludes and fugues by J.S. Bach, as well as the preludes, nocturnes, and etudes of F. Chopin, validating the usability of the method, are also presented and thoroughly discussed. Additionally, the paper compares the accuracy obtained with the developed solution against the accuracy of music notation generated by a musical instrument from a reputable manufacturer, a comparison which clearly indicates the superiority of the proposed algorithm.

Highlights

  • The tonality of musical pieces is inextricably linked to the musical notation, in which the key signatures play an important role

  • Advances in musical instrumentation have enabled automation of musical notation generation based on the analysis of a recorded piece of music; e.g., some modern musical instruments provide this kind of functionality, presenting the generated musical notation on a built-in display

  • The algorithm presented in this paper provides the means for an effective, real-time recognition of the key signature associated with a given piece of music, based on the analysis of a very small number of initial notes



Introduction

The tonality of musical pieces is inextricably linked to musical notation, in which key signatures play an important role. Advances in musical instrumentation have enabled automation of musical notation generation based on the analysis of a recorded piece of music; e.g., some modern musical instruments provide this kind of functionality, presenting the generated musical notation on a built-in display. The following question arises: how can the key signature of a given piece of music be determined effectively, i.e., in real time and based on only a few notes? The authors of this paper began their search for an effective key-signature recognition algorithm as a result of their experience with electronic keyboard instruments that offer real-time presentation of the musical notation of the currently played piece. The approach presented in this paper could potentially be applied to improve the effectiveness of automated music notation generation in terms of key signature recognition. In the case of misclassification, the decision can be corrected and the already generated music notation retranscribed.
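The music-signature idea described above can be illustrated with a short sketch: each of the twelve pitch classes is assigned a unit vector anchored at the center of the circle of fifths (adjacent fifths are 30° apart), and the notes of a piece contribute duration-weighted copies of these vectors. This is only a minimal illustration under our own assumptions, not the authors' exact algorithm; the function names and the nearest-direction heuristic are hypothetical.

```python
import math

def music_signature(notes):
    """Sum duration-weighted unit vectors for the pitch classes of a piece.

    notes: iterable of (pitch_class, duration) pairs, with pitch classes
           numbered 0..11 (C = 0, C# = 1, ..., B = 11).
    Returns the resultant (x, y) vector in the circle-of-fifths plane.
    """
    x = y = 0.0
    for pc, dur in notes:
        # Position of this pitch class around the circle of fifths:
        # stepping by a fifth (7 semitones) visits all 12 pitch classes.
        pos = (pc * 7) % 12
        angle = math.radians(pos * 30.0)  # 360° / 12 = 30° per fifth
        x += dur * math.cos(angle)
        y += dur * math.sin(angle)
    return x, y

def nearest_pitch_class(x, y):
    """Pitch class whose circle-of-fifths direction lies closest to the
    resultant vector (a crude hint at the key signature)."""
    angle = math.degrees(math.atan2(y, x)) % 360.0
    pos = round(angle / 30.0) % 12
    # Invert the circle-of-fifths mapping (7 is self-inverse mod 12).
    return (pos * 7) % 12
```

For example, a piece consisting only of the note G (pitch class 7) yields a resultant vector pointing at G's position on the circle, so `nearest_pitch_class` returns 7. In a real-time setting, the running sums `x` and `y` can be updated incrementally as each new note arrives, which is what makes this family of methods attractive for recognition from the first few notes.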

