Abstract

Recent work on the integration of auditory and visual information during speech perception has indicated that adults are surprisingly good at, and rely extensively on, lip reading. The conceptual status of lip-read information is of interest: such information is at once both visual and phonological. Three experiments investigated the nature of short-term coding of lip-read information in hearing subjects. The first experiment used asynchronous visual and auditory information and showed that subjects' ability to repeat words, when heard speech lagged behind lip movements, was unaffected by the lag duration, both quantitatively and qualitatively. This suggests that lip-read information is immediately recoded into a durable code. An experiment on serial recall of lip-read items showed a serial position curve containing a recency effect (characteristic of auditory but not visual input). It was then shown that an auditory suffix diminishes the recency effect obtained with lip-read stimuli. These results are consistent with the hypothesis that seen speech that is not heard is encoded into a durable code sharing some properties with heard speech. The results of the serial recall experiments are inconsistent with interpretations of the recency and suffix effects in terms of precategorical acoustic storage, for they demonstrate that recency and suffix effects can be supra-modal.
