Abstract

Understanding actions based on either language or observation of gestures is presumed to involve the motor system and to reflect the engagement of an embodied conceptual network. The role of the left inferior frontal gyrus (IFG) in language tasks is well established, but the role of the right hemisphere is unclear, with some imaging evidence suggesting right IFG activation when gestures mismatch speech. Using transcranial direct current stimulation (tDCS), we explored hemispheric asymmetries in the assumed cognitive embodiment required for gestural-verbal integration. Clips of symbolic gestures, drawn from a rich set of emblems and pantomimes, served as primes for verbal targets. Participants performed a semantic relatedness judgment under three stimulation conditions: anodal tDCS (atDCS) over the left IFG, atDCS over the right IFG, and sham. A non-semantic attentional load task served as a control. AtDCS over the right IFG produced faster responses to symbolic gestures than atDCS over the left IFG or sham stimulation, whereas no differences were observed across the three stimulation conditions for the attentional load task. These results support a right-lateralization bias of the human mirror neuron system in processing gestural-verbal stimuli. Gesture comprehension may be enhanced by improved integration of gesture and language.
