Abstract

Emerging technologies in the domain of mixed reality offer rich new possibilities for the study and practice of joint music performance. Beyond the technological challenges, bringing music players together in mixed reality raises important questions about their performance and embodied coordination. In this study, we designed a mixed reality platform to assess a remote, bidirectional polyrhythmic interaction between two players, mediated in real time by their three-dimensional embodied avatars and a shared, virtual ‘drum circle’. We leveraged a multi-layered analysis framework to assess the players’ performance quality, embodied coregulation and first-person interaction experience, using statistical techniques for time-series analysis and mixed-effects regression, and focusing on contrasts of visual coupling (not seeing / seeing as avatars / seeing as real) and auditory context (metronome / music). Results reveal that an auditory context with music improved performance output as measured by a prediction error, and increased movement energy and levels of experienced agency. Visual coupling impacted experiential qualities and induced prosocial effects: increased partner realism led to higher levels of shared agency and self-other merging. Embodied coregulation between players was affected by both auditory context and visual coupling, suggesting prediction-based compensatory mechanisms for dealing with the novelty, difficulty and expressivity of the musical interaction. This study contributes to the understanding of music performance in mixed reality by demonstrating, through a methodological approach, how coregulation between players is shaped by visual coupling and auditory context, and it provides a basis and future directions for further action-oriented research.
