Abstract

Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Visual cues to material qualities such as gloss have recently received attention, but how they interact with haptic (touch) information has been largely overlooked. Here, we show that touch modulates gloss perception: objects that feel slippery are perceived as glossier (more shiny). Participants explored virtual objects that varied in look and feel. A discrimination paradigm (Experiment 1) revealed that observers integrate visual gloss with haptic information: observers could easily detect an increase in glossiness when it was paired with a decrease in friction, whereas increased glossiness coupled with decreased slipperiness produced only a small perceptual change, because the visual and haptic changes counteracted each other. Subjective ratings (Experiment 2) reflected a similar interaction: slippery objects were rated as glossier and vice versa. The sensory system thus treats visual gloss and haptic friction as correlated cues to surface material. Although friction is not a perfect predictor of gloss, the visual system appears to know and use a probabilistic relationship between these variables to bias perception, a sensible strategy given the ambiguity of visual cues to gloss.
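
To make the final claim concrete, consider a schematic bivariate-Gaussian illustration (not an equation taken from the paper): if gloss $G$ and friction $F$ covary across real materials with correlation $\rho$, an observer who has learnt this statistic should shift a gloss estimate toward the conditional mean,

$$E[G \mid F = f] \;=\; \mu_G + \rho\,\frac{\sigma_G}{\sigma_F}\,(f - \mu_F).$$

With $\rho < 0$ (glossier surfaces tend to be lower in friction), a felt friction below average ($f < \mu_F$) raises the expected gloss, which is the direction of bias reported here: slippery-feeling objects look glossier.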

Highlights

  • Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be

  • We report our participants’ ability to discriminate between stimuli that differ in glossiness and rubberiness

  • We show that observers have learnt a statistical relationship between visually defined gloss and haptically defined friction, and that this knowledge is reflected in how the two cues are integrated

Introduction

Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Little is known about how visual and haptic information interact when we estimate material properties; existing research is almost entirely constrained to the perception of roughness[12,13,14,15,16,17,18], with a few studies on visual-haptic cues to compliance[19,20,21,22,23,24]. For other object properties, visual and haptic information can be integrated optimally: judgments are more precise (less variable) for visual-haptic stimuli than when based on either vision or haptics alone[25,26]. This contrasts with multi-sensory findings in material perception: although some visual-haptic averaging has been found in surface roughness perception, studies have not found improvements in precision when both modalities provide information, relative to a single modality[12,15]. Visual and haptic stimuli were matched in size, shape and …
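
As a point of reference, "optimal integration" here refers to the standard maximum-likelihood cue-combination model (the textbook formulation, not an equation reproduced from this paper): given unbiased visual and haptic estimates $\hat{S}_V$ and $\hat{S}_H$ with independent noise of variance $\sigma_V^2$ and $\sigma_H^2$, the reliability-weighted combination

$$\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V,$$

has variance

$$\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \;\le\; \min(\sigma_V^2, \sigma_H^2),$$

so bimodal judgments should be less variable than either unimodal judgment; it is this predicted precision gain that the roughness studies cited above failed to observe.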

