Abstract

Distances in virtual environments (VEs) viewed on a head-mounted display (HMD) are typically underperceived relative to the intended distance. This paper presents an experiment comparing perceived egocentric distance in a real environment with that in a matched VE presented on the Oculus Quest and Oculus Quest 2. Participants made verbal judgments and blind walking judgments to an object on the ground. Both the Quest and Quest 2 produced underperception. Verbal judgments in the VE were 82% and 75% of the object distance in the Quest and Quest 2, respectively, in contrast with real-world judgments, which were 94% of the object distance. Blind walking judgments were 68% and 70% of object distance in the Quest and Quest 2, respectively, compared to 88% in the real world. This project shows that significant underperception of distance persists even in modern HMDs.
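As an illustration of how such percentages are typically derived (a minimal hypothetical sketch, not the authors' analysis code; the distances and judgments below are invented), judged distance can be expressed as a proportion of the actual distance and averaged over trials:

    import numpy as np

    # Hypothetical per-trial data: actual object distances (m) and one
    # participant's distance judgments (m) in a single viewing condition.
    actual = np.array([3.0, 4.5, 6.0, 3.0, 4.5, 6.0])
    judged = np.array([2.4, 3.5, 4.6, 2.6, 3.4, 4.4])

    # Accuracy is summarized as judged distance expressed as a percentage
    # of actual distance, averaged across trials.
    accuracy = np.mean(judged / actual) * 100
    print(f"Judgments averaged {accuracy:.0f}% of object distance")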

Highlights

  • In order for virtual reality (VR) to be fully effective, the accuracy with which distance is perceived in virtual environments (VEs) should be similar to that in real environments

  • Tukey post-hoc tests were used to evaluate the primary hypothesis that verbal distance judgments would be more accurate in the real environment than in the Quest and Quest 2 (an illustrative sketch of such pairwise comparisons follows this list)

  • Egocentric distance in a VE presented on the Oculus Quest and Oculus Quest 2 was underperceived relative to the intended object distance and relative to perceived distance measured in the real environment on which the VE was based
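The post-hoc comparisons mentioned above can be sketched with statsmodels' pairwise_tukeyhsd; the condition labels and accuracy scores below are hypothetical placeholders and do not reproduce the study's data:

    import numpy as np
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Hypothetical per-participant accuracy scores (judged distance as a
    # proportion of actual distance) for the three viewing conditions.
    scores = np.array([0.95, 0.92, 0.96, 0.93,   # real environment
                       0.83, 0.80, 0.84, 0.81,   # Oculus Quest
                       0.76, 0.73, 0.77, 0.74])  # Oculus Quest 2
    groups = ["real"] * 4 + ["quest"] * 4 + ["quest2"] * 4

    # Tukey's HSD tests every pairwise difference between conditions while
    # controlling the family-wise error rate across the three comparisons.
    result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
    print(result.summary())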



Introduction

In order for virtual reality (VR) to be fully effective, the accuracy with which distance is perceived in virtual environments (VEs) should be similar to that in real environments. Distance in VR is consistently underperceived compared to the real world, especially when viewed through a head-mounted display (HMD) (Witmer and Kline, 1998; Thompson et al., 2004; Kelly et al., 2017). Real estate development stakeholders viewing a virtual walk-through of a planned structure (Ullah et al., 2018) will perceive the space to be smaller than intended, potentially leading to planning and decision errors. A soldier learning to operate a vehicle or to coordinate spatially with other troops in VR will learn a set of perception-action associations based on underperceived distances, which might require recalibration in the real environment. Misperception of spatial properties of a VE can therefore undermine its value by introducing costly errors into decisions and actions based on the misperceived environment.

