Abstract

Lip-reading and interpreting hand gestures provide nonverbal information that aids speech comprehension in noisy environments and emphasizes certain key utterances. In this fMRI study, we examined whether similar semantic information conveyed by either finger movements or lip movements is processed by common or distinct brain regions. Subjects viewed videos of a hand conveying number information via finger movements and of a face whose lip movements conveyed the same numerical information. Control stimuli consisted of meaningless finger and lip movements. Lip-reading numbers activated the left posterior superior temporal sulcus (STS), whereas identifying numbers presented by fingers activated the intraparietal region (IPR) bilaterally. Conjunction analysis revealed common activation in the right IPR for numbers presented via fingers and lips. Our data indicate that the left hemisphere decodes human movements conveying semantic information, although the specific brain region engaged may depend on the body part that is moving.

