Abstract

This research is part of a broader project exploring how movement qualities can be recognized through the auditory channel: can we perceive an expressive full-body movement quality by means of its interactive sonification? The paper presents a sonification framework and an experiment to evaluate whether embodied sonic training (i.e., experiencing interactive sonification of one's own body movements) improves the recognition of such qualities through the auditory channel alone, compared to a non-embodied sonic training condition. We focus on the sonification of two mid-level movement qualities: fragility and lightness. Our sonification models, described in the first part, are based on the assumption that specific compounds of spectral features of a sound can contribute to the cross-modal perception of a specific movement quality. The experiment, described in the second part, involved 40 participants divided into two groups (embodied sonic training vs. no training). Participants were asked to report the level of lightness and fragility they perceived in 20 audio stimuli generated with the proposed sonification models. Results show that (1) both expressive qualities were correctly recognized from the audio stimuli, and (2) a positive effect of embodied sonic training was observed for fragility but not for lightness. The paper concludes with a description of the artistic performance that took place in 2017 in Genoa (Italy), in which the outcomes of the presented experiment were exploited.

Highlights

  • Interactive sonification of human movement has been receiving growing interest from both researchers and industry

  • The distributions are skewed because people tended to answer “very high” or “absent”

  • We presented an experiment to evaluate the impact of sonic versus non-sonic embodied training on the recognition of two expressive qualities through the auditory channel alone, via their sonifications


Introduction

Interactive sonification of human movement has been receiving growing interest from both researchers and industry (e.g., see [14,22], and the ISon Workshop series). The work presented in this paper was part of the European Union H2020 ICT Dance Project, which aimed at developing techniques for the real-time analysis of movement qualities and their translation to the auditory channel. Applications of the project's outcome include systems that allow visually impaired and blindfolded people to "see" the qualities of movement through the auditory channel. Dance adopted a participative interaction design involving artists,
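To make the idea of translating movement qualities to the auditory channel concrete, the following is a minimal, purely illustrative sketch (not the paper's actual sonification model): the function name, the two input features, and the particular feature-to-parameter mappings are all assumptions chosen for illustration. It maps two hypothetical movement features, normalized acceleration and smoothness, to synthesis parameters that a real-time audio engine could consume.

```python
import math

def lightness_sonification(accel, smoothness):
    """Illustrative mapping from movement features to sound parameters.

    accel      -- normalized movement acceleration in [0, 1] (assumed feature)
    smoothness -- normalized movement smoothness in [0, 1] (assumed feature)
    Returns a dict of synthesis parameters for a hypothetical audio engine.
    """
    # Clamp inputs to the expected [0, 1] range.
    accel = min(max(accel, 0.0), 1.0)
    smoothness = min(max(smoothness, 0.0), 1.0)

    # Example mapping: smoother movement -> higher pitch (220-880 Hz,
    # i.e., up to two octaves above the base frequency).
    pitch_hz = 220.0 * 2.0 ** (2.0 * smoothness)

    # More energetic movement -> louder output.
    gain = 0.1 + 0.8 * accel

    # Brightness (e.g., a target spectral centroid weight) grows with both.
    brightness = 0.5 * (accel + smoothness)

    return {"pitch_hz": pitch_hz, "gain": gain, "brightness": brightness}
```

In an interactive setting, such a function would be called once per motion-capture frame and its output sent to a synthesizer, so the performer hears the sound change with their own movement; the cross-modal design question is which mappings make a quality such as lightness recognizable from the sound alone.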


