Our actions shape our everyday experience: what we experience, and how we perceive and remember it, are deeply affected by how we interact with the world. Performing an action to deliver a stimulus engages neurophysiological processes that are reflected in the modulation of sensory and pupil responses. We hypothesized that these processes shape memory encoding, parsing the experience by grouping self- and externally generated stimuli into differentiated events. Participants encoded sound sequences in which either the first or the last few sounds were self-generated and the rest externally generated. We tested recall of the sequential order of sounds that had originated from the same source (within event) or from different sources (across events). Memory performance was not higher for within-event sounds, suggesting that actions did not structure the memory representation. However, during encoding, we observed the expected electrophysiological response attenuation for self-generated sounds, together with increased pupil dilation triggered by actions. Moreover, at the boundary between events, physiological responses to the first sound from the new source were influenced by the direction of the source switch. Our results suggest that introducing actions creates a stronger contextual shift than removing them, even though actions do not directly contribute to memory performance. This study contributes to our understanding of how interacting with sensory input shapes experience by exploring the relationships between action effects on sensory responses, pupil dilation, and memory encoding. Importantly, it challenges the notion that low-level neurophysiological mechanisms associated with action execution contribute meaningfully to the modulation of the self-generation effect.