Abstract

In projection-based Virtual Reality (VR) systems, typically only one head-tracked user views stereo images rendered from the correct view position. Other users are presented with a distorted image that moves with the first user's head motion, making it difficult for them to correctly view and interact with 3D objects in the virtual environment. In close-range VR systems, such as the Virtual Workbench, distortion effects are especially large because objects are within close range and users are relatively far apart. On these systems, multi-user collaboration proves to be difficult. In this paper, we analyze the problem and describe a novel, easy-to-implement method to prevent and reduce image distortion and its negative effects on close-range interaction task performance. First, our method combines a shared camera model with view distortion compensation: it minimizes the overall distortion for each user, while important user-personal objects such as interaction cursors, rays and controls remain distortion-free. Second, our method retains co-location for interaction techniques to make interaction more consistent. We performed a user experiment on our Virtual Workbench to analyze user performance under distorted view conditions with and without our method. Our findings demonstrate the negative impact of view distortion on task performance and the positive effect our method introduces. This indicates that our method can enhance the multi-user collaboration experience on close-range, projection-based VR systems.
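
The abstract names two ingredients: a shared camera model and per-user compensation for personal objects. The sketch below illustrates one plausible reading of these ideas; it is not the paper's implementation. The planar projection screen at z = 0, the simple averaging of head positions, the depth-preserving re-projection, and all function names are assumptions made for illustration only.

```python
import numpy as np

def shared_eye(head_positions):
    """Average all tracked head positions into one shared camera position,
    distributing the projection error over the users instead of assigning
    it entirely to the non-tracked ones (illustrative assumption)."""
    return np.mean(np.asarray(head_positions, dtype=float), axis=0)

def screen_hit(eye, point, screen_z=0.0):
    """Intersection of the line eye -> point with the projection plane z = screen_z."""
    direction = point - eye
    t = (screen_z - eye[2]) / direction[2]
    return eye + t * direction

def compensate(point, user_eye, shared, screen_z=0.0):
    """Re-position a user-personal point (e.g. a cursor or ray tip) so that,
    when rendered from the shared camera, it lands on the screen location the
    owner's own eye expects; the point's depth (z) is preserved."""
    hit = screen_hit(user_eye, point, screen_z)   # correct on-screen location for this user
    direction = hit - shared
    t = (point[2] - shared[2]) / direction[2]     # keep the original depth
    return shared + t * direction

# Example: two users at a workbench-style display (positions in metres, assumed)
head_a = np.array([-0.3, 0.4, 0.6])
head_b = np.array([ 0.3, 0.4, 0.6])
cursor_a = np.array([-0.1, 0.1, 0.1])             # user A's stylus tip

eye = shared_eye([head_a, head_b])
print("shared camera:", eye)
print("compensated cursor for A:", compensate(cursor_a, head_a, eye))
```

In this reading, the shared scene is rendered once from the averaged camera, while each user's personal objects are pre-warped per frame with `compensate` so they remain visually co-located with the user's physical input device; a full implementation would do this per eye for stereo rendering.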
