Abstract

Learners need to know a considerable number of words to function in a second or foreign language. To help increase their word knowledge, learners are encouraged to engage in activities that provide a rich source of vocabulary, such as listening to music and audiobooks, and watching films, television, and video. In many of these activities, learners can listen to and read ‘matched’ content (i.e., text that is presented both in writing and aurally). For example, television programs and films are often accompanied by subtitles that closely adhere to the auditory input. While reading and listening to matched content may be a fairly common experience, we have little understanding of how comprehenders process the two sources of information, how the addition of audio changes word reading, or how it might affect word learning. Eye-tracking provides a means of measuring the effort associated with processing words, yet very few studies have explicitly investigated written-word processing while listening, and even fewer have examined this in the context of word learning. The technology allows researchers to synchronize eye movements in reading to an auditory text, but it requires technical know-how. The goal of this research methods paper is to provide methodological and technical guidance on the use of eye-tracking in reading-while-listening research, with an emphasis on investigating vocabulary learning and processing.
