Abstract

Perceptual learning and contextual learning are two types of implicit visual learning that can co-occur in the same tasks. For example, to find an animal in the woods, you need to know where to look in the environment (contextual learning) and you must be able to discriminate its features (perceptual learning). However, contextual and perceptual learning are typically studied using distinct experimental paradigms, and little is known regarding their comparative neural mechanisms. In this study, we investigated contextual and perceptual learning in 12 healthy adult humans as they performed the same visual search task, and we examined psychophysical and electrophysiological (event-related potentials) measures of learning. Participants were trained to look for a visual stimulus, a small line with a specific orientation, presented among distractors. We found better performance for the trained target orientation as compared to an untrained control orientation, reflecting specificity of perceptual learning for the orientation of trained elements. This orientation specificity effect was associated with changes in the C1 component. We also found better performance for repeated spatial configurations as compared to novel ones, reflecting contextual learning. This context-specific effect was associated with the N2pc component. Taken together, these results suggest that contextual and perceptual learning are distinct visual learning phenomena that have different behavioral and electrophysiological characteristics.
