Musical performance has traditionally been understood as an auditory experience, with emotion conveyed primarily through sound. Emerging research, however, highlights the role of physical movement (gestures, posture, and body language) in enhancing emotional expression and deepening the connection between performers and audiences. Although existing studies have explored aspects of this relationship, comprehensive data are lacking on how specific movements modulate emotional responses across musical genres. This study addresses that gap by investigating the interaction between human motion and musical expression, focusing on the emotional resonance experienced by both performers and audience members. Using motion capture, biometric sensors, and facial recognition technology, the research analyzed performances by professional and amateur musicians across classical, jazz, and contemporary music; audience members’ emotional responses were captured through physiological measurements and post-performance surveys. One limitation is that the study examined only live performance settings, leaving digital and virtual performances unexplored. Key findings show that expressive movements raised emotional intensity ratings by 1.6 points (p < 0.001), accompanied by a 6.9 BPM rise in heart rate (p = 0.0012) and a 1.6 µS increase in electrodermal activity (p = 0.0008). Synchronization of movement with musical elements, especially during climaxes, correlated strongly with heightened emotional responses (r = 0.85, p < 0.01). Classical performances showed the tightest coupling between movement and emotion, with audience emotional intensity peaking at 8.3 on the Likert scale.
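The abstract reports a Pearson correlation (r = 0.85) between movement synchronization and emotional response. As a minimal sketch of how such a correlation might be computed, the snippet below pairs per-performance synchronization scores with audience intensity ratings; all variable names and data are invented for illustration and do not reproduce the study's dataset or analysis pipeline.

```python
import numpy as np

# Hypothetical data: 30 performances, each with a movement-synchronization
# score in [0, 1] and a Likert-style audience emotional-intensity rating.
rng = np.random.default_rng(0)
sync_scores = rng.uniform(0.0, 1.0, size=30)
noise = rng.normal(0.0, 0.5, size=30)
emotion_ratings = 3.0 + 5.0 * sync_scores + noise  # invented linear relationship

# Pearson correlation between the two variables.
r = np.corrcoef(sync_scores, emotion_ratings)[0, 1]
print(f"Pearson r = {r:.2f}")
```

In practice the study would pair its measured synchronization metric with the physiological and survey data described above, but the correlation computation itself takes this form.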