The relation between speech and gesture is a topic of vigorous research in language comprehension and production. Recent studies indicate that speech and gestures are integrated during language comprehension (Kelly, Özyürek, & Maris, 2010), with mutual and obligatory interactions (Kita & Özyürek, 2003). This hypothesis is also supported by neurophysiological findings on the integration of speech and gestures (Willems, Özyürek, & Hagoort, 2007). In the present study, using two behavioral tasks, we further investigated the integration of speech and gesture by focusing on the ‘unity effect’ from the multisensory literature, whereby two or more sensory signals with spatiotemporal contiguity and semantic/informational relatedness are perceived as a unified multisensory whole (e.g., Vatakis & Spence, 2007). In particular, we manipulated the level of semantic relatedness between speech and gestures. In Experiment 1, we presented a series of dynamic iconic gestures (e.g., “cut”, “saw”, “write”) together with the corresponding auditory speech. We manipulated semantic congruency across three conditions: congruent, weakly incongruent (e.g., “cut” paired with “saw”), and strongly incongruent (e.g., “cut” or “saw” paired with “write”). Participants performed a speeded detection task on unimodal (auditory or visual) and multimodal trials, responding to two target gestures (e.g., respond when you see, hear, or both see and hear “cut” and “saw”); all stimulus combinations were tested in a blocked design. Analysis of the reaction times (RTs) revealed, as expected, shorter RTs for the multimodal congruent condition than for the unimodal and incongruent conditions. In Experiment 2, participants performed an unspeeded temporal order judgment (TOJ) task, in which the conditions described above were presented at a set of stimulus onset asynchronies (SOAs: ±541, ±333, ±208, ±125, and 0 ms). Analysis of the just noticeable difference (JND) showed higher JNDs, and thus worse temporal discrimination performance, for the congruent than for the incongruent conditions. Both experiments therefore demonstrated, for the first time, the unity effect in speech-gesture pairings, as well as a facilitatory effect of semantic relatedness on the integration of speech and gestures. Overall, the present study supports the hypothesis of integration between speech and iconic gestures, opening new pathways for research on the multisensory and temporal relationships between these two communicative signals.
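For readers unfamiliar with TOJ analyses, the following is a minimal sketch of how a JND can be estimated from responses at the SOA levels used in Experiment 2. It assumes a cumulative-Gaussian psychometric fit, a common choice in TOJ studies but not stated in the abstract, and the response proportions below are hypothetical values for illustration only, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# SOA levels from Experiment 2 (ms); assumed convention:
# negative = audio leading, positive = vision leading.
soas = np.array([-541, -333, -208, -125, 0, 125, 208, 333, 541])

# Hypothetical proportions of "vision-first" responses for one
# participant/condition (illustrative only, not the study's data).
p_vision_first = np.array([0.02, 0.08, 0.20, 0.35, 0.50,
                           0.66, 0.81, 0.93, 0.97])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: pss = point of subjective simultaneity,
    sigma = spread of the underlying temporal-order distribution."""
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soas, p_vision_first,
                            p0=[0.0, 150.0])

# One common JND definition: half the SOA interval between the 25% and
# 75% points of the fitted curve, i.e. norm.ppf(0.75) * sigma (~0.675 sigma).
jnd = norm.ppf(0.75) * sigma
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

Under this kind of fit, a shallower psychometric slope (larger sigma) yields a larger JND; the abstract's finding of higher JNDs for congruent pairings would thus correspond to flatter curves, i.e., congruent speech-gesture pairs being harder to pull apart in time.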