Abstract

When two individuals interact in a collaborative task, such as carrying a sofa or a table, spatiotemporal coordination of their individual motor behavior usually emerges. In many cases, interpersonal coordination can arise independently of verbal communication, based on the observation of the partner's movements and/or the object's movements. In this study, we investigate how social coupling between two individuals can emerge in a collaborative task under different modes of perceptual information. A visual reference condition was compared with three conditions providing new types of additional auditory feedback in real time: effect-based auditory feedback, performance-based auditory feedback, and combined effect/performance-based auditory feedback. We developed a new paradigm in which the actions of both participants continuously result in a seamlessly merged effect on an object simulated by a tablet computer application. In this task, participants had to synchronize their movements at a 90° phase difference and precisely adjust their finger dynamics to keep the object (a ball) rotating accurately on a given circular trajectory on the tablet. Results demonstrate that interpersonal coordination in a joint task can be altered in various ways by different kinds of additional auditory information.
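The 90° relative-phase requirement can be made concrete with a short sketch. The code below is purely illustrative: the function names, the phase-plane phase estimate, and the simulated movement signals are assumptions for demonstration, not details of the study's actual apparatus or analysis.

```python
import math

def instantaneous_phase(pos, vel, omega):
    """Phase angle (radians) of a harmonic movement, estimated from its
    position and velocity via the phase-plane convention atan2(-vel/omega, pos)."""
    return math.atan2(-vel / omega, pos)

def mean_phase_error_deg(phases_a, phases_b, target_deg=90.0):
    """Mean absolute deviation (degrees) of the relative phase between two
    movement series from a target phase difference, with angle wrapping."""
    target = math.radians(target_deg)
    errs = []
    for pa, pb in zip(phases_a, phases_b):
        diff = (pa - pb) - target
        # wrap the difference to (-pi, pi] before taking the magnitude
        diff = math.atan2(math.sin(diff), math.cos(diff))
        errs.append(abs(math.degrees(diff)))
    return sum(errs) / len(errs)

# Example: two simulated 1 Hz finger movements, A leading B by exactly 90 deg,
# so the coordination error relative to the 90 deg target is essentially zero.
omega = 2 * math.pi  # 1 Hz in rad/s
ts = [i * 0.01 for i in range(200)]
phases_a = [instantaneous_phase(math.cos(omega * t + math.pi / 2),
                                -omega * math.sin(omega * t + math.pi / 2),
                                omega) for t in ts]
phases_b = [instantaneous_phase(math.cos(omega * t),
                                -omega * math.sin(omega * t),
                                omega) for t in ts]
print(round(mean_phase_error_deg(phases_a, phases_b), 6))  # → 0.0
```

In practice, instantaneous phase for non-sinusoidal movement data is often estimated differently (e.g., via the analytic signal); the phase-plane estimate here is just the simplest self-contained choice.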

Highlights

  • Researchers have recently focused on different modes of non-verbal communication concerning interpersonal coordination as a basis of social interaction (Vicaria and Dickens, 2016)

  • Sport, music, and computer-game expertise, as well as pretest performance, were taken into account because they could influence performance in the tetherball paradigm. Comparing these variables between each feedback group and the visual feedback group (VFG), we found no significant differences except for sport expertise between the VFG and the performance-based auditory feedback group (PAFG) [F(1, 16) = 6.38, p = 0.022, ηp2 = 0.29]

  • Results demonstrate that error reduction was faster with effect-based auditory feedback (EAF) and with combined EAF and PAF (CAF) than in the visual condition; no statistical difference was observed for performance-based auditory feedback (PAF)
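As a consistency check on the reported statistic (not part of the original analysis), partial eta squared for a one-way ANOVA effect can be recovered from the F value and its degrees of freedom via ηp² = F·df1 / (F·df1 + df2):

```python
def partial_eta_squared(f_stat, df1, df2):
    """Partial eta squared from an F statistic and its degrees of freedom,
    using eta_p^2 = SS_effect / (SS_effect + SS_error) = F*df1 / (F*df1 + df2)."""
    return (f_stat * df1) / (f_stat * df1 + df2)

# Values reported in the highlight above: F(1, 16) = 6.38
print(round(partial_eta_squared(6.38, 1, 16), 2))  # → 0.29
```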


Introduction

Researchers have recently focused on different modes of non-verbal communication concerning interpersonal coordination (e.g., mimicry, gestures, and facial expressions) as a basis of social interaction (Vicaria and Dickens, 2016). These kinds of non-verbal behavior can cause spatiotemporal coordination and support affective entrainment between two or more individuals (Phillips-Silver and Keller, 2012). Waterhouse et al. (2014) reported that two dancers coordinated non-verbally during their choreographed performance. They synchronized the same movements or aligned the onsets of different movements, relying on visual cues from their body movements as well as on auditory cues from breathing and stepping sounds.

