Abstract

Expressive music performance and cardiac arrhythmia can be viewed as deformations of, or deviations from, an underlying pulse stream. I propose that the results of these pulse displacements can be treated as actual rhythms and represented accurately via a literal application of common music notation, which encodes proportional relations among duration categories as well as figural and metric groupings. I apply the theory to recorded music containing extreme timing deviations and to electrocardiographic (ECG) recordings of cardiac arrhythmias. The rhythm transcriptions are based on rigorous computer-assisted quantitative measurements of onset timings and durations. The root-mean-square errors of the rhythm transcriptions ranged from 19.1 to 87.4 ms for the music samples and from 24.8 to 53.0 ms for the arrhythmia examples. For the performed music, the representation makes concrete the gap between score and performance. For the arrhythmia ECGs, the transcriptions show rhythmic patterns evolving through time, progressions that are obscured by the predominant representations based on individual-beat morphology and frequency. To make tangible the similarities between cardiac and music rhythms, I match the heart rhythms to music with similar rhythms to form assemblage pieces. The use of music notation yields representations that enable formal comparisons and automated as well as human-readable analysis of the time structures of performed music and of arrhythmia ECG sequences beyond what is currently possible.
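The root-mean-square error figures quoted above compare measured onset times against the idealized onset times implied by a rhythm transcription. A minimal sketch of that computation follows; the onset values and the function name `rmse_ms` are illustrative assumptions, not data or code from the paper.

```python
import math

# Hypothetical onset times in seconds: values measured from a recording
# versus the idealized times implied by a notated rhythm transcription.
# These numbers are made up for illustration only.
measured = [0.00, 0.52, 1.01, 1.48, 2.03]
notated = [0.00, 0.50, 1.00, 1.50, 2.00]

def rmse_ms(a, b):
    """Root-mean-square error between two onset sequences, in milliseconds."""
    diffs = [x - y for x, y in zip(a, b)]
    return 1000.0 * math.sqrt(sum(d * d for d in diffs) / len(diffs))

print(round(rmse_ms(measured, notated), 1))  # → 19.0
```

A lower value indicates that the notated rhythm, played back mechanically, would reproduce the measured timings more closely.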
