Abstract

Crossmodal associations may arise at neurological, perceptual, cognitive, or emotional levels of brain processing. Higher-level modal correspondences between musical timbre and visual colour have been previously investigated, though with limited sets of colours. We developed a novel response method that employs a tablet interface to navigate the CIE Lab colour space. The method was used in an experiment where 27 film music excerpts were presented to participants (n = 22) who continuously manipulated the colour and size of an on-screen patch to match the music. Analysis of the data replicated and extended earlier research: for example, happy music was associated with yellow, music expressing anger with large red colour patches, and sad music with smaller patches towards dark blue. Correlation analysis suggested patterns of relationships between audio features and colour patch parameters. Using partial least squares regression, we tested models for predicting colour patch responses from audio features and ratings of perceived emotion in the music. Parsimonious models that included emotion robustly explained between 60% and 75% of the variation in each of the colour patch parameters, as measured by cross-validated R². To illuminate the quantitative findings, we performed a content analysis of structured spoken interviews with the participants. This provided further evidence of a significant emotion mediation mechanism, whereby people tended to match colour association with the perceived emotion in the music. The mixed-method approach of our study gives strong evidence that emotion can mediate crossmodal association between music and visual colour. The CIE Lab interface promises to be a useful tool in perceptual ratings of music and other sounds.
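The paper does not describe how its tablet interface is implemented, but any display of a CIE Lab colour picker must map Lab coordinates to screen colours. As a minimal sketch (not the authors' code, assuming a D65 white point and an sRGB display, with out-of-gamut values clamped):

```python
def lab_to_srgb(L, a, b):
    """Map a CIE Lab coordinate to an 8-bit sRGB triple (D65 white point).
    Out-of-gamut values are clamped, as a colour-picker display would do."""
    def f_inv(t):  # inverse of the Lab companding function
        d = 6.0 / 29.0
        return t ** 3 if t > d else 3.0 * d * d * (t - 4.0 / 29.0)

    # Lab -> CIE XYZ, scaled by the D65 reference white
    fy = (L + 16.0) / 116.0
    x = 0.95047 * f_inv(fy + a / 500.0)
    y = 1.00000 * f_inv(fy)
    z = 1.08883 * f_inv(fy - b / 200.0)

    # XYZ -> linear sRGB (standard 3x3 matrix)
    rgb_lin = (
        3.2406 * x - 1.5372 * y - 0.4986 * z,
        -0.9689 * x + 1.8758 * y + 0.0415 * z,
        0.0557 * x - 0.2040 * y + 1.0570 * z,
    )

    def encode(c):  # clamp, gamma-compand, quantise to 8 bits
        c = min(max(c, 0.0), 1.0)
        c = 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055
        return int(round(255.0 * c))

    return tuple(encode(c) for c in rgb_lin)
```

The three Lab axes (lightness L* and the opponent axes a* and b*) can then be bound to the interface's controls in whatever way the experiment requires; patch size is a separate parameter.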

Highlights

PLOS ONE | DOI:10.1371/journal.pone.0144013 | December 7, 2015

  • When associating colour with music, natural soundscapes, or soundscape compositions, do people use different strategies? The question of how associations between visual and auditive modes of perception emerge has been scientifically investigated for more than a hundred years (e.g. [1]).

  • We formulated three testable questions: Do people associate different colours with music expressive of discrete emotions? Do colour associations align with perceived dimensional emotions in music? Do men and women differ in colour patch association with music? We also explored the extent to which colour association could be explained by computationally extracted audio features and emotion ratings of the music, i.e. whether emotion would contribute to predicting colour over and above the audio features.

  • We investigated the role of emotion as a mediating variable between audio features and colour patch parameters by comparing different models using partial least squares regression (PLS; [37], as implemented in [38]).
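The PLS models in the study were fit with existing software ([37], [38]). For readers unfamiliar with the method, a minimal single-response PLS fit via the NIPALS algorithm (an illustration only, not the authors' implementation) can be sketched as:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Single-response PLS regression via NIPALS.
    Returns coefficients b and intercept b0 so that y_hat = X @ b + b0."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        cov = Xc.T @ yc              # covariance between predictors and response
        nrm = np.linalg.norm(cov)
        if nrm < 1e-12:              # response already fully explained
            break
        w = cov / nrm                # weight vector for this component
        t = Xc @ w                   # score vector
        tt = t @ t
        p = Xc.T @ t / tt            # X loadings
        qk = (yc @ t) / tt           # y loading
        Xc = Xc - np.outer(t, p)     # deflate predictors and response
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)
    b0 = y_mean - x_mean @ b
    return b, b0
```

PLS is well suited here because computationally extracted audio features are typically highly correlated; the latent components avoid the collinearity problems of ordinary least squares. The cross-validated R² reported in the study corresponds to refitting such a model on training folds and scoring its predictions on held-out excerpts.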


Introduction

When associating colour with music, natural soundscapes, or soundscape compositions, do people use different strategies? The question of how associations between visual and auditive modes of perception emerge has been scientifically investigated for more than a hundred years (e.g. [1]). It is generally understood that while some aspects of crossmodal correspondences might have a psychobiological basis [2], other patterns of association might depend on gender [3], be acquired by the individual [4], or be defined culturally [5]. The literature identifies at least four mechanisms whereby crossmodal association can emerge, roughly corresponding to levels of neural processing in the brain. An overview is given in Spence’s tutorial [2], which focusses on three classes of crossmodal correspondence: structural, statistical, and semantic.

