In everyday face-to-face communication, speakers convey information through speech and rely on co-occurring nonverbal cues, such as hand gestures and facial expressions. The integration of speech and gesture facilitates both language comprehension and theory-of-mind reasoning. Consecutive dialogue interpreting (DI) allows interlocutors of different linguistic backgrounds to communicate with each other. Because the interpreter renders a turn only after the speaker has finished, the listener sees the gesture first and hears the target-language speech a few seconds later, resulting in speech-gesture asynchrony. In this study, we used functional near-infrared spectroscopy hyperscanning to investigate how speech-gesture asynchrony influences different levels of communication. Twenty groups were recruited for the DI experiments. The results showed that when the interpreter performed consecutive interpreting, time-lagged neural coupling at the temporoparietal junction decreased compared to simultaneous interpreting. This suggests that speech-gesture asynchrony significantly weakened the interlocutors' ability to understand each other's mental states, and the decrease in neural coupling was significantly correlated with the interpreter's interpreting skill. In addition, time-aligned neural coupling at the left inferior frontal gyrus increased, suggesting that, as compensation, the interlocutors increasingly engaged verbal working memory as the communication proceeded.