Abstract

We present a method for the analysis of the finger-string interaction in guitar performance and the computation of fine actions during the plucking gesture. The method is based on motion capture using high-speed cameras that track the position of reflective markers placed on the guitar and fingers, in combination with audio analysis. A major problem inherent in optical motion capture is marker occlusion; in guitar playing, the right hand of the guitarist is extremely difficult to capture, especially during the plucking process, where the fingertip markers are lost very frequently. This work presents two models that allow the reconstruction of the position of occluded markers: a rigid-body model to track the motion of the guitar and strings, and a flexible-body model to track the motion of the hands. In combination with audio analysis (onset and pitch detection), the method can estimate a comprehensive set of sound control features that includes the plucked string, the plucking finger and the characteristics of the plucking gesture in the phases of contact, pressure and release (e.g. position, timing, velocity, direction or string displacement).
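
The rigid-body reconstruction of occluded markers can be sketched with a standard least-squares rigid alignment (Kabsch algorithm): the marker geometry known from a calibration frame is aligned to the markers that remain visible in the current frame, and the resulting rotation and translation are applied to recover the occluded marker. The Python sketch below is only an illustration under that assumption; the function names and the calibration "template" variable are hypothetical and not taken from the paper.

    import numpy as np

    def fit_rigid_transform(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst
        (Kabsch algorithm). src, dst: (N, 3) arrays of corresponding points."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return R, dst_c - R @ src_c

    def reconstruct_occluded(template, visible_idx, visible_pos, occluded_idx):
        """Estimate an occluded marker on a rigid body (hypothetical helper).

        template     : (M, 3) marker positions in a calibration frame
        visible_idx  : indices of markers tracked in the current frame
        visible_pos  : (len(visible_idx), 3) tracked positions
        occluded_idx : index of the marker to reconstruct
        """
        R, t = fit_rigid_transform(template[visible_idx], visible_pos)
        return R @ template[occluded_idx] + t

Three or more non-collinear visible markers are enough to determine the rigid transform, after which any lost marker on the same body can be recovered from the calibration geometry.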

Highlights

  • The interaction between a musician and a musical instrument determines the characteristics of the sound produced

  • In order to estimate the best-fitting plane, at least three correctly identified markers are needed, which is a reasonable requirement since the wrist and the first joints of each finger, the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joints, are correctly identified in 99% of the frames (see the plane-fitting sketch after this list)

  • The method is based on motion capture and audio analysis
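
The best-fitting plane mentioned in the highlights can be computed in closed form: with three or more markers, the plane passes through their centroid and its normal is the direction of least variance of the marker cloud, i.e., the last right singular vector of the centered positions. The following Python sketch is a minimal example under that assumption; the function name is not from the paper.

    import numpy as np

    def best_fitting_plane(points):
        """Least-squares plane through three or more 3D marker positions.

        points : (N, 3) array, N >= 3 (markers should not be collinear).
        Returns (centroid, unit_normal): the plane passes through the centroid
        and the normal is the direction of smallest spread of the points."""
        points = np.asarray(points, dtype=float)
        if points.shape[0] < 3:
            raise ValueError("at least three markers are needed")
        centroid = points.mean(axis=0)
        # Last right singular vector of the centered cloud = plane normal.
        _, _, vt = np.linalg.svd(points - centroid)
        return centroid, vt[-1]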

Introduction

The interaction between a musician and a musical instrument determines the characteristics of the sound produced. Understanding this interaction is a field of growing interest, driven in recent years by technological advances that have made measuring devices more accurate yet less expensive. In guitar playing, this interaction occurs with the left hand (fingering) and the right hand (plucking). Fingering mainly determines the tone (pitch), while plucking determines the qualities of the sound. In this work we focus mainly on the latter type (excitation gestures).
