Abstract

One unique capability of augmented reality (AR) is to visualize hidden objects as a virtual overlay on the real objects that occlude them. This “X-ray vision” visualization metaphor has proven invaluable for operation and maintenance tasks such as locating utilities behind a wall. Locating occluded virtual objects requires users to estimate the closest projected positions of the virtual objects on their real occluders, an estimate that is generally subject to a parallax effect. In this paper we studied the task of locating virtual pipes behind a real wall with “X-ray vision”, with the goal of establishing relationships between task performance and the spatial factors that cause parallax under different forms of visual augmentation. We introduced and validated a laser-based target designation method that is generally useful for AR-based interaction with augmented objects beyond arm's reach. The main findings are that people can mentally compensate for the parallax error when extrapolating the positions of virtual objects onto the real surface, given traditional 3D depth cues for spatial understanding. This capability is, however, unreliable: it degrades as the viewing offset between the user and the virtual objects increases and as the distance between the virtual objects and their occluders grows. Experimental results also show that positioning performance improves greatly and becomes unaffected by these factors when the AR support provides visual guides indicating the closest projected positions of the virtual objects on the surfaces of their real occluders.
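To make the parallax effect concrete, the short Python sketch below (not from the paper; the wall plane, viewpoint, and distances are illustrative assumptions) computes the closest orthogonal projection of a hidden point onto the occluding wall and the point where the user's line of sight through that hidden point pierces the wall. The gap between the two is the parallax error, which grows with the viewing offset and with the object's distance behind the wall.

import numpy as np

def project_onto_plane(point, plane_point, plane_normal):
    # Closest (orthogonal) projection of `point` onto the occluder plane.
    n = plane_normal / np.linalg.norm(plane_normal)
    return point - np.dot(point - plane_point, n) * n

def line_of_sight_hit(eye, point, plane_point, plane_normal):
    # Where the ray from the eye through the hidden point pierces the plane
    # (assumes the ray is not parallel to the plane).
    n = plane_normal / np.linalg.norm(plane_normal)
    d = point - eye
    t = np.dot(plane_point - eye, n) / np.dot(d, n)
    return eye + t * d

# Illustrative setup: a wall at z = 0, a virtual pipe point 0.5 m behind it,
# viewed from a position offset 1 m to the side and 2 m in front of the wall.
wall_point  = np.array([0.0, 0.0, 0.0])
wall_normal = np.array([0.0, 0.0, 1.0])
pipe_point  = np.array([0.0, 0.0, -0.5])   # 0.5 m behind the wall
eye         = np.array([1.0, 0.0, 2.0])    # off-axis viewpoint

closest = project_onto_plane(pipe_point, wall_point, wall_normal)
seen_at = line_of_sight_hit(eye, pipe_point, wall_point, wall_normal)
parallax_error = np.linalg.norm(seen_at - closest)
print(f"closest projection: {closest}")
print(f"line-of-sight hit:  {seen_at}")
print(f"parallax error:     {parallax_error:.3f} m")

With these example numbers the line-of-sight hit lands 0.2 m away from the closest projected position; increasing either the lateral viewing offset or the depth of the pipe behind the wall increases this error, consistent with the factors studied in the paper.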
