Abstract

As the use of virtual and augmented reality applications becomes more common, the need to fully understand how observers perceive spatial relationships grows more critical. One of the key requirements in engineering a practical virtual or augmented reality system is accurately conveying depth and layout. This requirement has frequently been assessed by measuring judgments of egocentric depth. These assessments have shown that observers in virtual reality (VR) perceive virtual space as compressed relative to the real world, resulting in systematic underestimations of egocentric depth. Previous work has indicated that similar effects may be present in augmented reality (AR) as well. This paper reports an experiment that directly measured egocentric depth perception in both VR and AR conditions; it is believed to be the first experiment to directly compare these conditions in the same experimental framework. In addition to VR and AR, two control conditions were studied: viewing real-world objects, and viewing real-world objects through a head-mounted display. Finally, the presence and absence of motion parallax was crossed with all conditions. Like many previous studies, this one found that depth perception was underestimated in VR, although the magnitude of the effect was surprisingly low. The most interesting finding was that no underestimation was observed in AR.
