Abstract

In three experiments, we investigated the influence of object-specific sounds on haptic scene recognition without vision. Blindfolded participants had to recognize, through touch, spatial scenes comprising six objects that were placed on a round platform. Critically, in half of the trials, object-specific sounds were played when objects were touched (bimodal condition), while sounds were turned off in the other half of the trials (unimodal condition). After participants first explored the scene, two objects were swapped, and the task was to report which of the objects had swapped positions. In Experiment 1, geometrical objects and simple sounds were used, while in Experiment 2, the objects comprised toy animals that were matched with semantically compatible animal sounds. In Experiment 3, we replicated Experiment 1, but now a tactile-auditory object identification task, in which participants learned to identify the objects based on tactile and auditory input, preceded the experiment. In each experiment, the results revealed a significant performance increase only after the switch from bimodal to unimodal trials. Thus, it appears that the release from bimodal (audio-tactile) to unimodal (tactile-only) identification produces a benefit that is not achieved in the reversed order, in which sound was added after experience with haptic-only exploration. We conclude that task-related factors other than mere bimodal identification cause the facilitation when switching from bimodal to unimodal conditions.

Highlights

  • We investigated whether object-specific sounds, presented at the moment objects are touched, facilitate recognition of haptic scenes

Introduction

As we navigate through our environment, the retinal image is constantly changing; while some objects come into our field of view, others fade out. Perception seems to be shaped by complex interactions between different sensory modalities (Eimer, 2004; Spence, 2007). When we are deprived of vision and have to rely on the remaining senses, such as hearing and touch, such interactions between the intact senses may become even more relevant (Hotting & Roder, 2009). We investigate whether haptic scene recognition can be influenced by object sounds that are played simultaneously with haptic exploration of the individual objects in the scene.

