Abstract

Humans can haptically perceive the shape of an object simply by wielding it, even without seeing it. In contrast, typical hand-held controllers for virtual reality (VR) are designed for general use and therefore cannot provide appropriate haptic shape perception when the user wields a specific virtual object. This mismatch between haptic and visual shape perception reduces immersion and leads to inappropriate object handling in VR. To address this problem, we propose a novel method for designing hand-held VR controllers that illusorily present a haptic shape matching the visual shape in VR. Research in ecological psychology suggests that the perceived shape of a wielded object can be modeled from a limited set of its mass properties. Building on this finding, we constructed a shape perception model using a data-driven approach: we collected data on the shapes perceived for various hand-held VR controllers with different mass properties and derived the model using regression techniques. We then implemented a design system that automatically generates hand-held VR controllers whose actual shapes are smaller than the target shapes while preserving the intended haptic shape perception. We verified that controllers designed with our system present the intended shape perception irrespective of their actual shapes.
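As a rough illustration of the data-driven modeling step described above, the sketch below fits a multi-output regression that maps a controller's mass properties to the shape a participant reports perceiving. The feature set (mass and principal moments of inertia about the grip point), the log transform, and all data values are assumptions for demonstration only, not the paper's actual model or data.

```python
# Minimal sketch (not the authors' code): regressing perceived shape from
# a controller's mass properties, as in a data-driven shape perception model.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: one row per controller used in a perception study.
# Columns: mass [kg], I1, I2, I3 (principal moments of inertia about the grip, kg*m^2)
mass_properties = np.array([
    [0.20, 8.0e-4, 7.5e-4, 1.0e-4],
    [0.25, 1.2e-3, 1.1e-3, 1.5e-4],
    [0.30, 1.6e-3, 1.5e-3, 2.0e-4],
    [0.35, 2.1e-3, 2.0e-3, 2.4e-4],
])
# Hypothetical responses: perceived length and width reported by participants [m]
perceived_shape = np.array([
    [0.30, 0.05],
    [0.35, 0.06],
    [0.40, 0.06],
    [0.44, 0.07],
])

# Dynamic-touch studies often relate perceived extent to powers of the moments
# of inertia, so the moments are log-transformed here (an assumption).
features = np.hstack([mass_properties[:, :1], np.log(mass_properties[:, 1:])])
model = LinearRegression().fit(features, perceived_shape)

# Predict the shape a new candidate controller would be perceived to have.
candidate = np.array([[0.28, np.log(1.4e-3), np.log(1.3e-3), np.log(1.8e-4)]])
print(model.predict(candidate))  # -> estimated [perceived length, perceived width]
```

Once such a model is fitted, a design system could search over candidate controller geometries (which determine the mass properties) for one whose predicted perceived shape matches the target shape while its physical shape remains smaller, in the spirit of the system described in the abstract.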
