Abstract

The observation of other people’s actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and the posterior middle temporal gyrus (pMTG). These regions have been shown to be activated by both visual and auditory input. Intriguingly, previous studies found no engagement of IFG and IPL in deaf participants during non-linguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people’s actions, we examined the effects of task (action understanding vs. passive viewing) and effector (arm vs. leg actions), as well as sign language experience, in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found stronger activation during an action recognition task than during a low-level visual control task in IFG, IPL, and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence that the responses of IFG, IPL, and pMTG during action recognition and passive viewing are not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.

Highlights

  • Action understanding supports the interpretation of others’ goals, intentions, and reasons (Brass et al., 2007)

  • Arm actions evoked greater activation than leg actions in the left anterior inferior parietal lobule (aIPL), whereas the opposite pattern was found in the right posterior superior temporal sulcus (pSTS)

  • We used both an action judgment task and a passive action observation task to investigate whether the processing of arm- and leg-related actions is affected by deprivation of auditory experience and/or by sign language experience

  • When the deaf individuals were divided into groups with early or late Chinese Sign Language (CSL) acquisition, no difference was observed between the two groups in any region of interest (ROI) (Bonferroni-corrected p < 0.05; see the illustrative sketch below)
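To make the correction concrete: below is a minimal sketch, assuming hypothetical ROI labels, group sizes, and data (this is not the authors’ analysis code), of how per-ROI group comparisons can be Bonferroni-corrected by dividing the significance threshold by the number of ROIs tested.

# Minimal sketch of Bonferroni-corrected group comparisons across ROIs.
# ROI labels, group sizes, and data are hypothetical, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rois = ["IFG", "IPL", "pMTG", "aIPL", "pSTS"]

# Hypothetical per-subject activation estimates (e.g., beta values)
# for deaf signers with early vs. late sign language acquisition.
early = {roi: rng.normal(0.5, 0.2, size=6) for roi in rois}
late = {roi: rng.normal(0.5, 0.2, size=6) for roi in rois}

alpha = 0.05
corrected_alpha = alpha / len(rois)  # Bonferroni: divide alpha by number of tests

for roi in rois:
    t, p = stats.ttest_ind(early[roi], late[roi])
    print(f"{roi}: t = {t:.2f}, p = {p:.3f}, "
          f"significant at corrected alpha: {p < corrected_alpha}")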


Introduction

Action understanding supports the interpretation of others’ goals, intentions, and reasons (Brass et al., 2007). We can understand actions conveyed by both visual and auditory input, and the two modalities can interact. Thomas and Shiffrar (2010) found that detection sensitivity for point-light displays of human actions improved when the displays were paired with veridical auditory cues (footsteps) but not when they were paired with simple tones. The human mirror system (hMS), consisting of the posterior inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and the superior temporal sulcus (STS), has consistently been suggested to play a crucial role in action understanding. The hMS has been reported to be activated both when observing actions and when listening to action-related sounds (Lewis et al., 2005; Gazzola et al., 2006; Lahav et al., 2007).
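Detection sensitivity in such point-light tasks is typically quantified with the signal detection measure d′, the difference between the z-transformed hit and false-alarm rates. Below is a minimal sketch of that computation; the function name and the trial counts are invented for illustration and are not data from Thomas and Shiffrar (2010).

# Generic d' (sensitivity) computation from signal detection theory.
# Counts are invented for illustration only.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts: detection is better with congruent footstep sounds
# than with simple tones.
print(d_prime(hits=42, misses=8, false_alarms=10, correct_rejections=40))   # footsteps
print(d_prime(hits=33, misses=17, false_alarms=12, correct_rejections=38))  # tones

A higher d′ indicates better discrimination between displays containing a human action and displays without one, independent of response bias.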

