Abstract

In this paper we present an experiment investigating subjects' ability to match pairs of synthetic auditory and haptic stimuli that simulate the sensation of walking on different surfaces. In three non-interactive conditions the audio–haptic stimuli were presented passively through a desktop system, while in three interactive conditions participants produced the audio–haptic feedback themselves while walking. Results show that material typology (i.e., solid or aggregate) is processed very consistently in both the auditory and haptic modalities. Subjects reported a higher level of semantic congruence for audio–haptic pairs of materials belonging to the same typology. Furthermore, matching ability was better in the passive case than in the interactive one, although this may be due to the limits of the technology used for the interactive haptic simulations.
