Abstract
It is now widely accepted that hand gestures help people understand and learn language. Here, we provide an exception to this general rule—when phonetic demands are high, gesture actually hurts. Native English-speaking adults were instructed on the meaning of novel Japanese word pairs that were phonetically hard for non-native speakers (/ite/ vs. /itte/, which differ by only a geminate) or easy (/tate/ vs. /butta/, which differ by a geminate and also their segmental composition). The words were presented either with or without congruent iconic gestures, for example, “Ite means stay” (with a STAY gesture). After instruction, participants were given phonetic and vocabulary tests for the words they had learned. Although performance on the phonetic task was above chance in all conditions, gesture played different roles in the semantic task for easy and hard word pairs—it helped word learning for easy pairs, but it hurt for hard pairs. These results suggest that gesture and speech are semantically integrated during word learning, but only when phonetic demands are not too high.