Abstract

This paper presents a human-scale virtual environment (VE) with haptic feedback, along with two experiments performed in the context of product design. The user interacts with a virtual mock-up using a large-scale bimanual string-based haptic interface called SPIDAR (Space Interface Device for Artificial Reality). An original self-calibration method is proposed. A vibro-tactile glove was developed and integrated into the SPIDAR to provide tactile cues to the operator. The purpose of the first experiment was: (1) to examine the effect of tactile feedback in a task involving reaching and touching different parts of a digital mock-up, and (2) to investigate the use of sensory substitution in such tasks. The second experiment aimed to investigate the effect of visual and auditory feedback in a car-light maintenance task. Results of the first experiment indicate that users could easily and quickly reach and precisely touch the different parts of the digital mock-up when sensory feedback (either visual, auditory, or tactile) was present. Results of the second experiment show that visual and auditory feedback improve average placement accuracy by about 54% and 60%, respectively, compared to the open-loop case.

Highlights

  • According to their size, virtual environments (VEs) can be divided into two categories: small-scale or desktop VEs and human-scale VEs

  • Paljic and Coquillart proposed a passive string-based haptic feedback system that can provide the user with grounded forces in a 3D manipulation space [26]

  • The resulting system of equations is solvable for the calibration parameters, showing that the SPIDAR can be calibrated without any external measurement system
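
To make the string-length geometry behind this concrete, here is a minimal illustrative sketch, not the paper's actual calibration procedure: it only shows how an effector position can be recovered from the eight measured string lengths once the frame attachment points are known, whereas the paper's self-calibration additionally solves for the frame parameters themselves. The anchor coordinates, the function name `estimate_position`, and the use of a generic least-squares solver are all assumptions made for illustration.

    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical string attachment points at the corners of a 2 m cubic
    # frame (metres). In the paper's self-calibration these coordinates are
    # among the unknowns; here they are fixed for illustration only.
    ANCHORS = np.array([
        [0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [2.0, 2.0, 0.0],
        [0.0, 0.0, 2.0], [2.0, 0.0, 2.0], [0.0, 2.0, 2.0], [2.0, 2.0, 2.0],
    ])

    def estimate_position(string_lengths, anchors=ANCHORS):
        """Recover the effector position from measured string lengths by
        minimising the residual between measured and geometric lengths."""
        def residuals(p):
            return np.linalg.norm(anchors - p, axis=1) - string_lengths
        x0 = anchors.mean(axis=0)  # start the search from the frame centre
        return least_squares(residuals, x0).x

    # Usage: synthesise lengths for a known point and recover it.
    true_p = np.array([0.7, 1.1, 0.9])
    lengths = np.linalg.norm(ANCHORS - true_p, axis=1)
    print(estimate_position(lengths))  # approximately [0.7, 1.1, 0.9]

Because each string contributes one length measurement, the eight strings over-determine the three position unknowns; the same redundancy is what makes it possible, in principle, to also estimate the frame parameters from motion data alone, which is the idea the highlight refers to.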


Summary

INTRODUCTION

Virtual environments (VEs) can be divided into two categories: small-scale or desktop VEs and human-scale VEs. Desktop VEs include all situations where the user sits in front of a desktop monitor or wears a Binocular Omni-Orientation Monitor (BOOM) [1]. In large-scale assembly simulations, the operator needs to operate and interact with virtual objects in a large workspace, which raises challenges for human-scale haptics (kinesthetic and tactile). Both accessibility testing and assembly simulations are interactive processes involving the operator and the handled objects, and the simulation environment must be able to react according to the user's actions.

HAPTIC INTERFACES
VIREPSE
PRODUCT DESIGN APPLICATION
Findings
REACH AND TOUCH EXPERIMENT
