Abstract

An experiment is described that tested whether wooden, plastic, and metallic objects can be classified based on reproduced auditory and vibrotactile stimuli. The results show that recognition rates are considerably above chance level with either unimodal auditory or vibrotactile feedback. Building on these findings, we investigated whether virtual buttons for professional appliances can be rendered with distinct tactile properties. To this end, a touchscreen device was equipped with various types of vibrotactile feedback triggered by the sensed pressing force and finger location. Different virtual button designs were tested by user panels, who subjectively evaluated the perceived tactile properties and materials. A first implementation reproduced the vibration recordings of the real materials used in the classification experiment; mainly because hardware limitations of our prototype made it impossible to render complex vibratory signals, this approach did not prove successful. A second implementation was then optimized for the capabilities of the device, additionally introducing surface compliance effects and button release cues: the new design led to generally high quality ratings, clear discrimination between buttons, and unambiguous material classification. The lesson learned is that various material and physical properties of virtual buttons can be successfully rendered through characteristic frequency and decay cues, provided they are correctly reproduced by the device.
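
To make the final point concrete, below is a minimal sketch (not the authors' implementation) of how a button-press transient can be rendered as a single exponentially decaying sinusoid, where the characteristic frequency and decay time encode the simulated material. All parameter values are illustrative assumptions, not data from the paper.

    import numpy as np

    FS = 8000  # vibration sample rate in Hz (assumed; actuator-dependent in practice)

    # Hypothetical (frequency Hz, decay time s) pairs: a stiff, resonant
    # material such as metal rings longer and higher than damped wood.
    MATERIALS = {
        "wood":    (300.0, 0.015),
        "plastic": (500.0, 0.030),
        "metal":   (900.0, 0.120),
    }

    def button_transient(material, duration=0.25):
        """Return a mono vibration waveform for a virtual button press."""
        freq, decay = MATERIALS[material]
        t = np.arange(int(FS * duration)) / FS
        # Exponentially decaying sinusoid: amplitude envelope exp(-t / decay).
        return np.exp(-t / decay) * np.sin(2 * np.pi * freq * t)

    # A quieter, shorter copy of the press transient can stand in for the
    # button release cue mentioned above.
    press = button_transient("metal")
    release = 0.4 * button_transient("metal", duration=0.1)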

Highlights

  • Participants reported that buttons reproducing wood (WS, Wood Filtered (WF)) and plastic (PS, Plastic Filtered (PF)) rendered similar stimuli, whereas buttons with metal feedback (MS, Metal Filtered (MF)) differed from the others in both sets

  • Besides confirming previous findings on the performance of auditory-based classification, our results show that degraded tactile feedback enables comparably good classification while keeping mismatch rates low

Introduction

Even though everyday human interaction with objects and events is mostly multisensory [37], sight is often fundamental. There are situations, however, in which one can rely on auditory and/or haptic cues only, for instance when a user is involved in multiple activities, or when an interface is visually occluded. As a specific case study, although the classification of materials is usually based mainly on visual cues [35], it may also rely on touch and/or audition. This happens as a consequence of tactile exploration or of other excitation (e.g., by tapping) of an object’s natural resonances [31]. Several studies have investigated the human ability to identify materials via the auditory or haptic modalities: the parchment-skin illusion experiment showed how acoustic information affects the perceived roughness during hand rubbing [18], while another study demonstrated differences in the perceived stiffness conveyed through audition or touch [27].
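
As a concrete illustration of this mechanism (ours, not drawn from the cited studies), the vibratory response of a tapped object is commonly modeled as a sum of exponentially decaying sinusoids, one per resonant mode; the mode frequencies and decay rates are what carry the material identity. A sketch with made-up modal data, reusing the conventions of the previous example:

    import numpy as np

    FS = 8000  # sample rate in Hz (assumed)

    def tap_response(modes, duration=0.5):
        """Sum of damped sinusoids; modes is a list of (freq_hz, decay_s, gain)."""
        t = np.arange(int(FS * duration)) / FS
        out = np.zeros_like(t)
        for freq, decay, gain in modes:
            out += gain * np.exp(-t / decay) * np.sin(2 * np.pi * freq * t)
        return out

    # Illustrative modal data: metal modes ring for a long time,
    # while wood modes are heavily damped and die out quickly.
    metal_tap = tap_response([(800, 0.30, 1.0), (2100, 0.20, 0.5), (3600, 0.10, 0.3)])
    wood_tap  = tap_response([(250, 0.02, 1.0), (700, 0.01, 0.4)])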

