Abstract

Our daily interactions and perceptions involve multiple sensory modalities; most exchanges are inherently multimodal, and this plays an important part in how we decode and understand our environments. Events often engage several senses at once. Musical experiences in particular can be highly multisensory, combining the obvious auditory stimulation with the visual elements of a live performance and physical stimulation of the body. In this paper, we propose a means of incorporating an additional somatic channel of communication into live performance and compositional practice, further augmenting the physical nature of live performance. This work explores the integration of augmented vibratory, or haptic, stimulation for audiences in live performance. The vibration interface is presented as an expressive and creative performance tool for composers. Vibrations, or haptics, are implemented as an additional instrumental line alongside auditory musical gestures, expanding the composer’s palette of expression through augmented somatic engagement. This paper describes the background, design, and development of a haptic interface for the purpose of audio-haptic listening-feeling, focusing on a study of motor latency for informing multimedia simultaneity.

Highlights

  • In a previous EVA paper (Armitage & Ng 2013), we proposed a multi-actuator interface for use in compositional and performance practice

  • The paper focussed on design requirements and preliminary designs of the haptic interface and how it fitted within the larger remit of the Interdisciplinary Centre for Scientific Research in Music (ICSRiM) research concepts and wider work in multimodality in music practice

  • An appreciation of system latency and jitter is required, first, to synchronise fixed-media haptic and auditory stimuli by adding latency to the audio system, and second, to assess whether the haptic interface could function in real-time applications


Introduction

In a previous EVA paper (Armitage & Ng 2013), we proposed a multi-actuator interface for use in compositional and performance practice. An appreciation of system latency and jitter is required, first, to synchronise fixed-media haptic and auditory stimuli by adding latency to the audio system, and second, to assess whether the haptic interface could function in real-time applications. These results provide a test-bed for further investigations into the human perception of haptic latency in time-based applications.
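The paper does not give an implementation, but the synchronisation strategy it describes can be sketched simply: measure the haptic chain's latency repeatedly, characterise its mean and jitter, and delay the (typically faster) audio stream by the difference. The function name, and the assumption that latencies are supplied as millisecond measurements, are illustrative only.

```python
import statistics

def sync_offsets(haptic_latencies_ms, audio_latency_ms):
    """Estimate haptic-chain latency and jitter, and the extra delay
    to add to the audio system so both stimuli arrive together.

    haptic_latencies_ms : repeated latency measurements of the haptic
                          chain, in milliseconds (hypothetical data)
    audio_latency_ms    : known output latency of the audio chain, ms
    """
    mean_haptic = statistics.mean(haptic_latencies_ms)
    jitter = statistics.stdev(haptic_latencies_ms)  # sample std. deviation
    # For fixed-media playback, delaying audio by the difference
    # compensates for the slower haptic chain.
    extra_audio_delay = max(0.0, mean_haptic - audio_latency_ms)
    return mean_haptic, jitter, extra_audio_delay
```

For example, haptic measurements of 48, 50, 52 and 50 ms against a 10 ms audio chain would suggest delaying the audio by 40 ms, with roughly 1.6 ms of jitter remaining as the floor on achievable simultaneity.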


