This study examines simultaneous interpreting (SI) in the Portuguese-Chinese language pair, focusing on the interplay between visual input and cognitive load. It posits that visual cues such as hand gestures may influence cognitive load during SI, a question that remains controversial in interpreting studies. To address it, we conducted an empirical study with 18 trainee interpreters divided into two groups: a control group receiving only audio input and an experimental group receiving additional video input. Using ELAN 6.3, we analyzed silent pauses exceeding 300 ms as an indicator of cognitive load, examining how audio and video input affect these pauses, with particular attention to source-speech segments accompanied by semantically related hand gestures. The results showed that the average duration of silent pauses was marginally shorter for interpreters with video input, although the difference between the two groups was not statistically significant. For both groups, pause duration increased significantly during segments with semantically related gestures, underscoring the high cognitive demand of these segments irrespective of visual input. Notably, participants with visual access were markedly more fluent when interpreting gesture-accompanied segments, suggesting that semantically related gestures provide cognitive benefits. Overall, this study contributes to the ongoing discussion of the role of visual input in SI, highlighting the potential of gesture input to alleviate cognitive load and improve interpreter performance.
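To make the pause-based measure concrete, the sketch below illustrates how silent-pause durations exported from ELAN might be filtered at the 300 ms threshold and compared across conditions. It is a minimal illustration only: the abstract does not specify the export format, column names, or statistical test, so the tab-delimited file, the hypothetical columns (participant, group, gesture_segment, duration_ms), and the use of Welch's t-test are assumptions, not the authors' actual analysis pipeline.

```python
# Minimal sketch (not the authors' pipeline): comparing silent-pause durations
# between an audio-only and an audio+video group, assuming pause annotations
# were exported from ELAN as a tab-delimited file with hypothetical columns
# "participant", "group", "gesture_segment" (boolean), and "duration_ms".
# The 300 ms threshold follows the abstract; the choice of Welch's t-test is
# an assumption made for illustration.
import pandas as pd
from scipy import stats

# Hypothetical export of pause annotations (one row per annotated pause).
pauses = pd.read_csv("pause_annotations.tsv", sep="\t")

# Keep only silent pauses longer than 300 ms, per the study's criterion.
pauses = pauses[pauses["duration_ms"] > 300]

# Between-group comparison of mean pause duration.
audio_only = pauses.loc[pauses["group"] == "audio", "duration_ms"]
audio_video = pauses.loc[pauses["group"] == "audio_video", "duration_ms"]
t_stat, p_value = stats.ttest_ind(audio_only, audio_video, equal_var=False)
print(f"Audio-only mean: {audio_only.mean():.1f} ms, "
      f"audio+video mean: {audio_video.mean():.1f} ms, p = {p_value:.3f}")

# Within each group, compare pauses in gesture-accompanied vs. other segments.
for group, data in pauses.groupby("group"):
    with_gesture = data.loc[data["gesture_segment"], "duration_ms"]
    without_gesture = data.loc[~data["gesture_segment"], "duration_ms"]
    t, p = stats.ttest_ind(with_gesture, without_gesture, equal_var=False)
    print(f"{group}: gesture {with_gesture.mean():.1f} ms vs. "
          f"non-gesture {without_gesture.mean():.1f} ms (p = {p:.3f})")
```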