Abstract

To investigate the differences in communicative activities by the same interlocutors in Japanese (their L1) and in English (their L2), an 8-hour multimodal corpus of multiparty conversations was collected. Three subjects participated in each conversational group, and they held conversations on free-flowing and goal-oriented topics in Japanese and in English. Their utterances, eye gazes, and gestures were recorded with microphones, eye trackers, and video cameras. The utterances and eye gazes were manually annotated: the utterances were transcribed, and the transcriptions of each participant were aligned with those of the other participants along the time axis. Quantitative analyses compared the communicative activities across conversational languages, conversation types, and levels of L2 expertise. The results reveal distinct utterance characteristics and gaze patterns that reflect the difficulty the participants felt under each conversational condition. Both the total and the average durations of utterances were shorter in the L2 conversations than in the L1 conversations. Differences in eye gaze were found mainly in gazes directed toward the information senders: speakers were gazed at more in the second-language conversations than in the native-language conversations. Our findings on the characteristics of second-language conversations suggest possible directions for future research in psychology, cognitive science, and human–computer interaction technologies.
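As a concrete illustration of the kind of quantitative comparison described above, the following Python sketch shows how manually annotated utterance intervals could be aggregated into total and average durations per conversational language. It is not part of the study; the record format and field names are assumptions made for this example only.

    from collections import defaultdict

    # Hypothetical utterance annotations: speaker, language condition, and
    # start/end times in seconds (field names assumed for this sketch).
    utterances = [
        {"speaker": "A", "language": "Japanese", "start": 0.0, "end": 2.4},
        {"speaker": "B", "language": "Japanese", "start": 2.6, "end": 5.1},
        {"speaker": "A", "language": "English", "start": 0.0, "end": 1.3},
        {"speaker": "C", "language": "English", "start": 1.5, "end": 2.2},
    ]

    # Group utterance durations by conversational language.
    durations = defaultdict(list)
    for utt in utterances:
        durations[utt["language"]].append(utt["end"] - utt["start"])

    # Report total and average utterance duration for each language condition.
    for language, values in durations.items():
        total = sum(values)
        print(f"{language}: total {total:.1f} s, "
              f"average {total / len(values):.1f} s over {len(values)} utterances")

The same aggregation could be repeated per conversation type or per level of L2 expertise by grouping on those (assumed) fields instead of the language field.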

Highlights

  • In typical human–human interactions, the interlocutors use speech and language as well as a wide variety of paralinguistic means and nonverbal behaviors to signal their speaking intentions to the partner, to express intimacy, and to coordinate their conversation (Argyle et al., 1968; Beattie, 1978, 1980; Clark, 1996; Kendon, 1967; Kleinke, 1986; Mehrabian and Wiener, 1967; Mehrabian and Ferris, 1967).

  • Many quantitative studies on human–human interaction have reported that eye gaze plays an important role in monitoring conversation content and contributes to the performance of collaborative tasks that require understanding one's communication partners (Boyle et al., 1994; Clark and Krych, 2004; Jokinen et al., 2013).

  • We collected an 8-hour multimodal corpus of multiparty conversations to investigate the differences in communicative activities by the same interlocutors in Japanese and in English.

Introduction

In typical human–human interactions, the interlocutors use speech and language as well as a wide variety of paralinguistic means and nonverbal behaviors to signal their speaking intentions to the partner, to express intimacy, and to coordinate their conversation (Argyle et al., 1968; Beattie, 1978, 1980; Clark, 1996; Kendon, 1967; Kleinke, 1986; Mehrabian and Wiener, 1967; Mehrabian and Ferris, 1967). Many quantitative studies on human–human interaction have reported that eye gaze plays an important role in monitoring conversation content and contributes to the performance of collaborative tasks that require understanding one's communication partners (Boyle et al., 1994; Clark and Krych, 2004; Jokinen et al., 2013). These findings on human–human interactions were mainly obtained from conversations held in the mother tongue (L1).

Data collection
Experimental setup
Participants
Procedure
Annotation features
Gaze events
Transcription
Utterances
Analyses of utterances
Gaze events in speaking
Analyses of gaze events in speaking
Differences in utterances
Differences in eye gazes
Effect of expertise in L2
Findings
Conclusion