Abstract

A growing area of focus in music improvisation is real-time interactive improvisation between a human and a computer system. In this paper, we present an interactive musical system that acts as a melody continuator. For each musical pattern given by the user, the system returns a new pattern built from general patterns, for both pitch and duration, stored in its knowledge base. This knowledge base consists of data-mining rules extracted from sets of melodies in different musical styles. The proposed system uses a new music representation scheme that treats pitch and duration separately, and it adopts a similarity measure originally developed for clustering categorical data. We also present experimental results, using Bach's chorales and jazz melodies as test inputs, both to assess the aesthetic quality of the proposed system and to compare its output with human results.
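The two ideas named in the abstract, a representation that separates pitch from duration and a similarity measure for categorical data, can be illustrated with a minimal sketch. The code below is an assumption for illustration only, not the paper's actual representation or measure: it splits a melody into two categorical sequences and compares patterns with a simple matching coefficient, one common similarity measure for categorical data.

```python
def split_melody(notes):
    """Separate (pitch, duration) pairs into two categorical sequences."""
    pitches = [p for p, _ in notes]
    durations = [d for _, d in notes]
    return pitches, durations

def categorical_similarity(a, b):
    """Fraction of positions where two equal-length sequences agree."""
    if len(a) != len(b):
        raise ValueError("patterns must have equal length")
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Hypothetical example patterns: pitches as MIDI numbers, durations in beats.
query = [(60, 1.0), (62, 0.5), (64, 0.5), (65, 1.0)]
candidate = [(60, 1.0), (62, 1.0), (64, 0.5), (67, 1.0)]

query_pitch, query_dur = split_melody(query)
cand_pitch, cand_dur = split_melody(candidate)

print(categorical_similarity(query_pitch, cand_pitch))  # pitch similarity: 0.75
print(categorical_similarity(query_dur, cand_dur))      # duration similarity: 0.75
```

Treating the two dimensions separately, as sketched here, lets a continuator match a user's pattern on pitch and on rhythm independently before combining the scores.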
