Abstract

The question of whether nonconscious processing can involve higher-level, semantic representations is of broad interest. Here, we demonstrate semantic processing of task-relevant and task-irrelevant features of nonconscious primes within a novel empirical test bed. In two experiments, musicians were visually primed with musical note triads varying in mode (i.e., major vs. minor) and position (i.e., the arrangement of notes within a triad). The task required discriminating only the mode of the following auditory target chord. In two experimental blocks, primes were either consciously visible or masked. Response times for auditory discrimination of the modes (the relevant dimension) of heard triads were measured. Crucially, the targets also varied in mode and position, creating different degrees of congruency with the visual primes. Based on the Theory of Event Coding, we expected and found interactions between relevant and irrelevant semantic characteristics of masked primes, indicating that even irrelevant prime meaning was processed. Moreover, our results indicated that both task-relevant and task-irrelevant prime characteristics are processed only under nonconscious conditions, and that practice in ignoring uninformative conscious primes transfers to a subsequent block. In conclusion, this study demonstrates cross-modal, automatic semantic processing using a novel approach to studying such effects.
