Abstract

Background: Humans often use co-speech gestures to promote effective communication. Attention has been paid to the cortical areas engaged in the processing of co-speech gestures.

Aims: To investigate the neural network underpinning the processing of co-speech gestures and to observe whether there is a relationship between the areas involved in language and gesture processing.

Methods & Procedures: We planned to include studies with neurotypical and/or stroke participants who underwent a bimodal task (i.e., processing of co-speech gestures with the related speech) and a unimodal task (i.e., speech or gesture alone) during a functional magnetic resonance imaging (fMRI) session. After a database search, abstract and full-text screening were conducted. Qualitative and quantitative data were extracted, and a meta-analysis was performed with the software GingerALE 3.0.2, using contrast analyses of uni- and bimodal tasks.

Main Contribution: The database search produced 1024 records. After the screening process, 27 studies were included in the review. Data from 15 studies were quantitatively analysed through meta-analysis. The meta-analysis found three clusters with significant activation: the left middle frontal gyrus and inferior frontal gyrus, and the bilateral middle occipital gyrus and inferior temporal gyrus.

Conclusions: There is a close link at the neural level between the semantic processing of auditory and visual information during communication.
These findings encourage the integration of co-speech gestures into aphasia treatment as a strategy to help people with aphasia communicate effectively.

What This Paper Adds

What is already known on this subject: Gestures are an integral part of human communication, and they may be related to speech processing at the neural level.

What this paper adds to existing knowledge: During the processing of bi- and unimodal communication, areas related to semantic processing and multimodal processing are activated, suggesting that there is a close link between co-speech gestures and spoken language at the neural level.

What are the potential or actual clinical implications of this work? Knowledge of the neural networks underlying gesture and speech processing will allow the adoption of model-based neurorehabilitation programmes to foster recovery from aphasia by strengthening the specific functions of these brain networks.
