Abstract
Research into the ability to coordinate one’s movements with external cues has focussed on simple rhythmic auditory and visual stimuli, or on interpersonal coordination with another person. Coordinating movements with a virtual avatar has not been explored in the context of responses to temporal cues. To determine whether cueing of movements using a virtual avatar is effective, people’s ability to accurately coordinate with the stimuli needs to be investigated. Here we focus on temporal cues, as timing studies have shown that visual cues can be difficult to follow. Real stepping movements were mapped onto an avatar using motion capture data. Healthy participants were then motion captured whilst stepping in time with the avatar’s movements, as viewed through a virtual reality headset. The timing of one of the avatar’s step cycles was accelerated or decelerated by 15% to create a temporal perturbation, which participants would need to correct for in order to remain in time. Step onset times of participants relative to the corresponding step onsets of the avatar were used to measure the timing errors (asynchronies) between them. Participants completed either a visual-only condition or an auditory-visual condition with footstep sounds included, at two stepping tempos (Fast: 400 ms interval; Slow: 800 ms interval). Participants’ asynchronies exhibited slow drift in the Visual-Only condition, but became stable in the Auditory-Visual condition. Moreover, we observed a clear corrective response to the phase perturbation in both the fast and slow tempo auditory-visual conditions. We conclude that an avatar’s movements can be used to influence a person’s own motion, but should be accompanied by auditory cues congruent with the movement to ensure a suitable level of entrainment is achieved. This approach has applications in physiotherapy, where virtual avatars present an opportunity to provide guidance that assists patients in adhering to prescribed exercises.
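The asynchrony measure and the 15% phase perturbation described above can be illustrated with a short sketch. The following Python snippet is a minimal illustration only, not the authors’ analysis code; the variable names, the placement of the perturbed cycle, and the simulated participant step onsets are assumptions made for demonstration.

```python
# Minimal sketch (assumed, illustrative): step-onset asynchronies relative to
# an avatar whose step sequence contains one 15% phase perturbation.
import numpy as np

BASE_INTERVAL = 0.400    # s, "Fast" tempo condition (use 0.800 for "Slow")
PERTURB_SCALE = 1.15     # one step cycle lengthened by 15% (0.85 to shorten)
N_STEPS = 20
PERTURBED_STEP = 10      # assumed index of the perturbed cycle

# Avatar step-onset times: regular intervals with a single perturbed cycle.
intervals = np.full(N_STEPS, BASE_INTERVAL)
intervals[PERTURBED_STEP] *= PERTURB_SCALE
avatar_onsets = np.cumsum(intervals)

# Hypothetical participant step onsets, e.g. extracted from motion capture
# (simulated here as the avatar onsets plus small random timing errors).
rng = np.random.default_rng(0)
participant_onsets = avatar_onsets + rng.normal(0.0, 0.02, N_STEPS)

# Asynchrony: participant onset minus the corresponding avatar onset.
# Negative values indicate the participant stepped ahead of the avatar.
asynchronies = participant_onsets - avatar_onsets
print(asynchronies.round(3))
```

In an analysis of this kind, a corrective response to the perturbation would appear as the asynchronies returning towards their pre-perturbation level over the steps following the perturbed cycle.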
Highlights
Mirroring of movements has been described as an important aspect of social interactions[1], as well as providing a top-down process that allows humans to anticipate actions and their goals when interacting with others[2].
Existing timing research in the visual domain has so far focussed on human-human interaction or very simple artificial cues, such as flashing lights or moving dots; coordinating movements with a virtual avatar has not been explored in the context of responses to temporal cues.
The use of visual stepping stones on a treadmill revealed that corrections to visual cues were faster than to auditory cues, indicating that gait may be more strongly influenced by visual cues in certain contexts[6].
Summary
Mirroring of movements has been described as an important aspect of social interactions[1], as well as providing a top-down process that allows humans to anticipate actions and their goals when interacting with others[2]. This process involves the mapping of visual information about an agent’s movement onto one’s own interpreted motor representation[3]. Existing timing research in the visual domain has so far focussed on human-human interaction or very simple artificial cues, such as flashing lights or moving dots; coordinating movements with a virtual avatar has not been explored in the context of responses to temporal cues. We hypothesise that a virtual reality avatar could provide effective cues for lower limb timing coordination, in the form of stepping actions.