Abstract

This article reports a pair of studies that test two opposing hypotheses derived from music theory scholarship regarding chord durations in popular music. The first hypothesis is that, regardless of tempo, chords will tend to last on average an ideal span of relative time, such as a bar. The second hypothesis is that, regardless of tempo, chords will tend to last on average an ideal span of absolute time, such as 2 s. Given the subjectivity of these parameters, three large encoded collections of harmony in popular music, each based on different musical styles and annotated by different musicians, were used to study the evidence for and against these two hypotheses. Average chord lengths were calculated for each song in the corpora using four measures: geometric mean length in bars, geometric mean length in seconds, median length in bars, and median length in seconds. Following a description of the data-wrangling stages, the article reports the use of analysis of variance and linear regression models to examine the validity of each hypothesis. Although neither hypothesis was supported consistently, more evidence was found for the second hypothesis, that chords tend to last on average an ideal span of absolute time regardless of tempo. This finding suggests the existence of a perceptual ideal for chord durations in popular music that should be quantified in seconds rather than bars.
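The four per-song summary statistics named above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the function name, the beat-based input representation, and the `beats_per_bar` default are assumptions; only the four measures (geometric mean and median, each in bars and in seconds) come from the abstract.

```python
import statistics
from math import exp, log

def chord_summaries(durations_beats, tempo_bpm, beats_per_bar=4):
    """Per-song chord-length summaries: geometric mean and median,
    each expressed in bars and in seconds (hypothetical helper)."""
    # Convert each chord duration from beats to seconds and to bars.
    secs = [d * 60.0 / tempo_bpm for d in durations_beats]
    bars = [d / beats_per_bar for d in durations_beats]

    def gmean(xs):
        # Geometric mean: exp of the mean of the logs.
        return exp(sum(log(x) for x in xs) / len(xs))

    return {
        "gmean_bars": gmean(bars),
        "gmean_secs": gmean(secs),
        "median_bars": statistics.median(bars),
        "median_secs": statistics.median(secs),
    }
```

For example, chords of 2, 4, and 8 beats at 120 BPM in 4/4 give a geometric mean of 2 s (1 bar) and a median of 2 s (1 bar); the hypotheses would then be compared by regressing such per-song averages against tempo.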
