Abstract

The influence of action-effect integration on motor control and sensory processing is often investigated in arrangements featuring human-machine interactions. Such experiments focus on predictable sensory events produced through participants' interactions with simple response devices. Action-effect integration may, however, also occur when we interact with human partners. The current study examined the similarities and differences in perceptual and motor control processes related to generating sounds with or without the involvement of a human partner. We manipulated the complexity of the causal chain of events between the initial motor event and the final sensory event. In the self-induced condition, participants generated sounds directly by pressing a button, while in the interactive condition, sounds resulted from a paired reaction-time task; that is, the final sound was generated indirectly, relying on the contribution of the partner. Auditory event-related potentials (ERPs) and force application patterns were similar in the two conditions, suggesting that social action effects produced with the involvement of a second human agent in the causal sequence are processed, and utilized as action feedback, in the same way as direct consequences of one's own actions. The only indication of a processing difference between the two conditions was a slow, posterior ERP waveform that started before the presentation of the auditory stimulus, which may reflect differences in stimulus expectancy or task difficulty.
