Abstract

Experiencing a stimulus in one sensory modality is often associated with an experience in another sensory modality. For instance, seeing a lemon might produce a sensation of sourness. This suggests some kind of cross-modal correspondence between vision and gustation. The aim of the current study was to explore whether such cross-modal correspondences influence cross-modal integration during perceptual learning. To that end, we conducted two experiments. Using a speeded classification task, Experiment 1 established a cross-modal correspondence between visual lightness and the frequency of an auditory tone. Using a short-term priming procedure, Experiment 2 showed that manipulating this cross-modal correspondence led to the creation of a cross-modal unit regardless of the nature of the correspondence (i.e., congruent in Experiment 2a or incongruent in Experiment 2b). However, a comparison of priming effect sizes suggested that cross-modal correspondences modulate cross-modal integration during learning, leading to newly learned units that differ in their stability over time. We discuss the implications of our results for the relation between cross-modal correspondence and perceptual learning in the context of a Bayesian explanation of cross-modal correspondences.

Highlights

  • Perception allows us to interact with and learn from our environment

  • These results are consistent with the idea that participants performed the gray discrimination task accurately and that the systematic association between a sound and a shade of gray did not affect the visual nature of the task

  • Separate mixed analyses of variance were performed on latencies (RTs) and correct response (CR) rates, with subject as a random variable, Tone Frequency (low-pitched vs. high-pitched) and Prime Type (light vs. dark) as within-subject variables, and stimulus onset asynchrony (SOA; 100 ms vs. 500 ms) as a between-subjects variable (an analysis sketch follows this list)

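The last highlight describes the statistical design: a 2 (Tone Frequency) × 2 (Prime Type) × 2 (SOA) mixed design with subject as a random factor. As a rough illustration only, the sketch below fits an analogous model in Python, with a random-intercept linear mixed model standing in for the reported mixed ANOVA. The sample size, column names, and simulated values are assumptions for the example, not the authors' data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per subject x condition cell with the
# mean latency (RT, in ms) for that cell. All names and values are placeholders.
rng = np.random.default_rng(0)
n_subjects = 40                                               # assumed sample size
subject = np.repeat(np.arange(1, n_subjects + 1), 4)          # 4 within-subject cells each
tone = np.tile(["low", "high"], n_subjects * 2)               # Tone Frequency (within)
prime = np.tile(np.repeat(["light", "dark"], 2), n_subjects)  # Prime Type (within)
soa = np.repeat(["100ms", "500ms"], n_subjects * 2)           # SOA (between), half the sample each
rt = 450 + rng.normal(0, 30, size=n_subjects * 4)             # placeholder mean RTs

df = pd.DataFrame({"subject": subject, "tone": tone,
                   "prime": prime, "soa": soa, "rt": rt})

# Random-intercept mixed model approximating the 2 x 2 x 2 mixed ANOVA on RTs;
# the same formula can be refit with a CR-rate column as the dependent variable.
model = smf.mixedlm("rt ~ tone * prime * soa", data=df, groups=df["subject"])
print(model.fit().summary())
```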

Summary

Introduction

Perception allows us to interact with and learn from our environment. It can be envisaged as an interface between a cognitive agent and its environment. Processing a situation may require integrating information from all of our senses, together with background contextual knowledge, in order to reduce the complexity and instability of the situation. What we call a “conscious experience” of a situation should therefore involve the integration of both a particular state of the cognitive system generated by the current situation (i.e., a perceptual state) and former cognitive states (i.e., memory states). Integration should thus be a relevant mechanism for both perceptual and memory processes (see Brunel et al., 2009).

