Abstract

To behave adaptively with sufficient flexibility, biological organisms must cognize beyond immediate reaction to a physically present stimulus. For this, humans use visual mental imagery [1, 2], the ability to conjure up a vivid internal experience from memory that stands in for the percept of the stimulus. Visually imagined contents subjectively mimic perceived contents, suggesting that imagery and perception share common neural mechanisms. Using multivariate pattern analysis on human electroencephalography (EEG) data, we compared the oscillatory time courses of mental imagery and perception of objects. We found that representations shared between imagery and perception emerged specifically in the alpha frequency band. These representations were present in posterior, but not anterior, electrodes, suggesting an origin in parieto-occipital cortex. Comparison of the shared representations to computational models using representational similarity analysis revealed a relationship to later layers of deep neural networks trained on object representations, but not auditory or semantic models, suggesting representations of complex visual features as the basis of commonality. Together, our results identify and characterize alpha oscillations as a cortical signature of representations shared between visual mental imagery and perception.
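The representational similarity analysis described above compares a neural dissimilarity structure against model dissimilarity structures. As a minimal illustration of that comparison (not the authors' pipeline; all variable names and the synthetic data are hypothetical), one can build representational dissimilarity matrices (RDMs) from condition-by-feature pattern matrices and correlate their upper triangles:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Condensed RDM (upper triangle of the condition x condition
    dissimilarity matrix), using 1 - Pearson correlation as distance."""
    return pdist(patterns, metric="correlation")

def rsa(neural_patterns, model_patterns):
    """Spearman correlation between a neural RDM and a model RDM."""
    rho, _ = spearmanr(rdm(neural_patterns), rdm(model_patterns))
    return rho

# Hypothetical data: 12 object conditions x 64 EEG-derived features,
# and a model representation constructed to be partly correlated with it.
rng = np.random.default_rng(0)
neural = rng.standard_normal((12, 64))
model = neural + 0.5 * rng.standard_normal((12, 64))
print(rsa(neural, model))
```

Rank correlation (Spearman) is the conventional choice here because neural and model dissimilarities are not expected to be related on a linear scale.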

Highlights

  • How do neural representations shared between imagery and perception emerge? Unlike perception, imagery lacks feedforward information flow from the stimulus, suggesting that neural representations shared between imagery and perception emerge through feedback information flow

  • Feedforward and feedback information in the visual brain are carried by different neural oscillation channels: theta and gamma oscillations carry feedforward information, and alpha and beta oscillations carry feedback information [12, 13]

  • Imagery and perception share neural dynamics in the alpha frequency band (Figure 1F; for timing, see figure caption), but not in the theta or beta frequency bands (Figures S1A–S1C)


Introduction

How do neural representations shared between imagery and perception emerge? Unlike perception, imagery lacks feedforward information flow from the stimulus, suggesting that shared representations emerge through feedback information flow. To test this, we cross-classified object representations between imagery and perception using multivariate pattern analysis. For each frequency of interest (Figure 1D), this yielded a two-dimensional classification accuracy matrix identifying the time-point combinations at which neural representations are similar between imagery and perception (Figure 1E). Our finding that imagery and perception share representations in the alpha frequency band from parieto-occipital sources has two implications.
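The two-dimensional accuracy matrix described here is the product of temporal-generalization cross-decoding: a classifier trained on perception data at one time point is tested on imagery data at every time point. A minimal sketch of that procedure, assuming band-filtered trial data and using scikit-learn's linear discriminant classifier (function and variable names are hypothetical):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def cross_time_decoding(percept, imagery, labels):
    """Train on perception at each time point, test on imagery at every
    time point; returns a (T_percept x T_imagery) accuracy matrix.

    percept, imagery: (n_trials, n_channels, n_times) arrays of
    band-filtered EEG data; labels: (n_trials,) object labels.
    """
    t_percept, t_imagery = percept.shape[-1], imagery.shape[-1]
    acc = np.zeros((t_percept, t_imagery))
    for t1 in range(t_percept):
        clf = LinearDiscriminantAnalysis()
        clf.fit(percept[:, :, t1], labels)  # train on perception at t1
        for t2 in range(t_imagery):
            # test on imagery at t2: off-diagonal cells reveal shared
            # representations at non-matching time points
            acc[t1, t2] = clf.score(imagery[:, :, t2], labels)
    return acc
```

Running this once per frequency band gives one accuracy matrix per band, which is how band-specific shared representations (e.g., alpha but not theta or beta) can be identified.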

