Abstract

This paper explores how speech and action are coordinated in a web-based task undertaken by two high school students working collaboratively at the computer. The paper focuses on the coordination involved in the interactions between the two students and the computer screen, keyboard, and mouse, and explores the temporal synchrony and ‘matching’ points between speaking and typing, and between speaking and mouse movements, within and between participants. Examples include coordination of speaking words aloud whilst typing, coordination of reading aloud from the screen with mouse movements, and coordination between participants, as when one individual is typing and the other talking. The discussion draws on the literature describing the coordination of language and action, kinesic behaviour, and nonverbal communication, including gesture, all of which have the potential to mediate conversation. Results indicate that most coordination of talk and action occurs at the beginning of the action. Sometimes work is done to ensure coordination, either by slowing down the talk or by pausing or stretching sounds mid-utterance. Talk that is coordinated temporally with some action on the screen is precise; in other words, even when action and talk are mismatched (e.g., a participant is not talking about what she is doing), talk and action can start and finish together.


Disclaimer: All third-party content on this website/platform is and will remain the property of their respective owners and is provided on "as is" basis without any warranties, express or implied. Use of third-party content does not indicate any affiliation, sponsorship with or endorsement by them. Any references to third-party content is to identify the corresponding services and shall be considered fair use under The CopyrightLaw.