Abstract

We present Snow, a cross-modal interface that integrates cold and tactile stimuli in mid-air to render snowflakes and raindrops for VR experiences. Snow uses six Peltier packs and an ultrasound haptic display to create cold-tactile sensations that let users feel snowflakes landing and rain falling on their bare hands. Our approach exploits humans' ability to identify tactile and cold stimuli projected onto the same skin location without the two masking each other, producing convincing illusions of snowflakes and raindrops. We tightly couple the visual and haptic renderings to depict melting snow and rain droplets for realistic visuo-tactile experiences. To render many snowflakes and raindrops at once, we propose an aggregated haptic scheme that simulates heavy snowfall and rainfall environments with large numbers of visual particles. Our results show that the aggregated haptic rendering scheme yields a more realistic experience than the alternative schemes, and that providing combined cold-tactile cues enhances the user experience in both the snow and rain scenarios compared to other modality conditions.
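
The abstract does not detail how the aggregated scheme maps many visual particles to a limited haptic output. The following is a minimal sketch of one plausible interpretation, assuming that impacts within a short time window are merged into a single ultrasound focal point whose intensity scales with impact density; the names, the time window, and the saturation rule are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ParticleImpact:
    x: float  # palm-plane coordinates in metres (assumed frame)
    y: float
    t: float  # impact time in seconds


def aggregate_impacts(impacts: List[ParticleImpact],
                      now: float,
                      window: float = 0.1) -> Tuple[Tuple[float, float], float]:
    """Merge recent impacts into one haptic command (hypothetical scheme).

    Returns the centroid of impacts inside the time window (focal-point
    position) and a normalised intensity that grows with the number of
    impacts, so dense snowfall or rainfall feels stronger without
    rendering every particle individually.
    """
    recent = [p for p in impacts if now - p.t <= window]
    if not recent:
        return (0.0, 0.0), 0.0
    cx = sum(p.x for p in recent) / len(recent)
    cy = sum(p.y for p in recent) / len(recent)
    intensity = min(1.0, len(recent) / 10.0)  # assumed saturation at 10 impacts per window
    return (cx, cy), intensity


if __name__ == "__main__":
    impacts = [ParticleImpact(0.01, 0.02, 0.95),
               ParticleImpact(-0.01, 0.00, 0.98),
               ParticleImpact(0.00, 0.01, 1.00)]
    focus, level = aggregate_impacts(impacts, now=1.0)
    print(f"focal point {focus}, intensity {level:.2f}")
    # In the actual system such a command would drive the ultrasound array,
    # while the Peltier packs supply the sustained cold component.
```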
