Abstract

In this paper, we present a two-handed 3D interaction approach for immersive virtual reality applications on a large vertical display. The proposed interaction scheme is based on hybrid motion-sensing technology that tracks the 3D position and orientation of multiple handheld devices. More specifically, the devices have embedded ultrasonic and inertial sensors that accurately identify their position and attitude in the air. The interaction architecture is designed for pointing and object-manipulation tasks. Since the sensor system provides only 3D spatial information, we develop an algorithm to precisely track the point of interest produced by the pointing task. For object manipulation, we carefully assign either a one-handed or a two-handed interaction scheme to each task: selection and translation are one-handed, while rotation and scaling are two-handed. By combining one-handed and two-handed interactions, the presented system provides users with more intuitive and natural 3D object manipulation. The feasibility and validity of the proposed method are demonstrated through user tests.
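The abstract does not detail how the two-handed rotation and scaling gestures are computed, but a common formulation derives a uniform scale factor from the change in hand separation and a rotation from the change in direction of the inter-hand vector. The sketch below illustrates that standard mapping under those assumptions; the function name and interface are hypothetical and not taken from the paper.

```python
import numpy as np

def two_handed_transform(prev_left, prev_right, cur_left, cur_right):
    """Illustrative two-handed manipulation mapping (assumed, not the
    paper's implementation): given the previous and current 3D positions
    of the two tracked handheld devices, return a uniform scale factor,
    a rotation angle (radians), and a rotation axis (unit vector)."""
    prev_v = np.asarray(prev_right, dtype=float) - np.asarray(prev_left, dtype=float)
    cur_v = np.asarray(cur_right, dtype=float) - np.asarray(cur_left, dtype=float)

    # Uniform scale: ratio of current to previous hand separation.
    scale = np.linalg.norm(cur_v) / np.linalg.norm(prev_v)

    # Rotation: axis-angle that aligns the previous inter-hand
    # direction with the current one.
    a = prev_v / np.linalg.norm(prev_v)
    b = cur_v / np.linalg.norm(cur_v)
    axis = np.cross(a, b)
    s = np.linalg.norm(axis)           # sin of the rotation angle
    c = np.dot(a, b)                   # cos of the rotation angle
    angle = np.arctan2(s, c)
    axis = axis / s if s > 1e-9 else np.zeros(3)  # degenerate: no rotation
    return scale, angle, axis
```

For example, moving the right hand from one unit away to two units away (with directions unchanged) yields a scale factor of 2 and no rotation, while sweeping the inter-hand vector from the x-axis to the y-axis yields a 90-degree rotation about the z-axis.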
