Abstract
Modulation of auditory activity occurs before and during voluntary speech movement. However, it is unknown whether orofacial somatosensory input is modulated in the same manner. The current study examined whether somatosensory event-related potentials (ERPs) in response to facial skin stretch change during speech and non-speech production tasks. Specifically, we compared ERP changes to somatosensory stimulation across different orofacial postures and speech utterances. Participants produced three different vowel sounds (voicing) or performed non-speech oral tasks in which they maintained a similar posture without voicing. ERPs were recorded from 64 scalp sites in response to the somatosensory stimulation under six task conditions (three vowels × voicing/posture) and compared to a resting baseline condition. The first negative peak for the vowel /u/ was reliably reduced from baseline in both the voicing and posturing tasks, but the other conditions did not differ. The second positive peak was reduced for all voicing tasks compared to the posturing tasks. The results suggest that the sensitivity of the somatosensory ERP to facial skin deformation is modulated by the task, and that somatosensory processing during speaking may be modulated differently depending on phonetic identity.