Listeners' expectations for melodies and harmonies in tonal music are perhaps the most studied aspect of music cognition. A long-standing debate concerns whether faster response times (RTs) to more strongly primed events (in a music-theoretic sense) are driven by sensory mechanisms, such as the repetition of sensory information, or by cognitive mechanisms, such as the activation of schemata that reflect learned tonal knowledge. We analyzed over 300 stimuli from 7 priming experiments comprising a broad range of musical material, using a model that transforms raw audio signals through a series of plausible physiological and psychological representations spanning a sensory-cognitive continuum. We show that RTs are modeled, in part, by information in periodicity pitch distributions, chroma vectors, and activations of tonal space, a representation of the major/minor key relationships of Western tonal music on a toroidal surface. We show that in tonal space, melodies are grouped by their tonal rather than their timbral properties, whereas the reverse is true for the periodicity pitch representation. Although tonal space variables explained more of the variation in RTs than periodicity pitch variables did, suggesting a greater contribution of cognitive influences to tonal expectation, a stepwise selection model contained variables from both representations and successfully explained the pattern of RTs across stimulus categories in 4 of the 7 experiments. Adding closure, a cognitive representation of a specific syntactic relationship, succeeded in explaining the results of all 7 experiments. We conclude that multiple representational stages along a sensory-cognitive continuum combine to shape tonal expectations in music.
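To make the representational chain concrete, the following is a minimal Python sketch of the kind of audio-to-chroma-to-key-activation pipeline the abstract describes; it is not the authors' model. It assumes the librosa library is available, uses the standard Krumhansl-Kessler probe-tone profiles as a stand-in for the weighting that drives tonal-space activation, and the file name `stimulus.wav` is hypothetical.

```python
# A minimal sketch, assuming librosa, of an audio -> chroma -> key-activation
# chain of the sort the abstract describes; NOT the authors' model.
# "stimulus.wav" is a hypothetical file name.
import numpy as np
import librosa

# Krumhansl-Kessler probe-tone profiles (C major / C minor).
KK_MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                     2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
KK_MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                     2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

def key_activations(wav_path: str) -> np.ndarray:
    """Return a 24-D key-activation vector (12 major keys, then 12 minor),
    obtained by correlating the stimulus's time-averaged chroma vector
    with each transposed key profile."""
    y, sr = librosa.load(wav_path)
    chroma = librosa.feature.chroma_stft(y=y, sr=sr).mean(axis=1)
    return np.array([
        np.corrcoef(chroma, np.roll(profile, tonic))[0, 1]
        for profile in (KK_MAJOR, KK_MINOR)
        for tonic in range(12)
    ])

acts = key_activations("stimulus.wav")  # hypothetical stimulus file
```

Projecting such a 24-key activation vector onto a toroidal map of key relationships, and then regressing RTs on variables drawn from each representational stage (e.g., by stepwise selection), would complete a pipeline of the sort the abstract outlines.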