Abstract

During the last two decades or so, information theory has been used as a tool to describe the probabilistic components of notated music. It has served, albeit only on occasion, both for the analysis and, somewhat secondarily, for the synthesis of music. In most cases, the information content in the pitch and/or interval structures of melodic lines has been assessed: entropies have been derived for the alphabets (pitches, pitch sequences, pitch intervals, and so forth) from which individual melodies or collections of melodies were assembled. The need to apply information theory to symbolic representations of music, rather than in some way to music itself, arises from the nature of information theory. Mathematical formulations of information content are not descriptive of conveyed meaning; rather, they describe distributional aspects of the symbolic characters used in the transmission of that meaning. The input data for the calculations of information theory are encodings of a communication that can be described statistically. For written language (one of the areas for which information theory was originally developed and where its efficacy has been effectively
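The entropy calculation the abstract alludes to can be sketched briefly. The following is a minimal illustration, not the paper's own method: it computes the Shannon entropy, in bits per symbol, of a pitch alphabet estimated from a short (invented) melodic sequence.

```python
# Hedged sketch: Shannon entropy of a pitch alphabet, estimated from
# symbol frequencies in a melody. The melody below is hypothetical.
from collections import Counter
from math import log2

def entropy(symbols):
    """Shannon entropy (bits/symbol) of a finite symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    # H = -sum over symbols s of p(s) * log2(p(s))
    return -sum((c / n) * log2(c / n) for c in counts.values())

melody = ["C", "D", "E", "C", "G", "E", "C", "D"]  # hypothetical pitch sequence
print(round(entropy(melody), 3))
```

The same function applies equally to the other alphabets the abstract mentions, such as pitch intervals, simply by feeding it the interval sequence instead of the pitch sequence.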
