Abstract

In human perception, the ability to determine eye height is essential, because eye height is used to scale the heights of objects, velocities, affordances, and distances, all of which allow for successful environmental interaction. It is well understood that eye height is fundamental to determining many of these percepts, yet how eye height itself is determined is still largely unknown. While the sources of information potentially specifying eye height naturally coincide in a real-world environment with a regular ground surface, they can easily diverge in similar and common virtual reality scenarios. We therefore conducted virtual reality experiments in which we manipulated the virtual eye height in a distance perception task to investigate how eye height might be determined in such a scenario. We found that humans rely more on postural cues to determine their eye height when visual and postural information conflict and little opportunity for perceptual-motor calibration is provided, as demonstrated by the predictable variations in their distance estimates. Our results suggest that under such circumstances eye height is informed by postural cues when estimating egocentric distances in virtual reality and, consequently, does not depend on an internalized value for eye height.

Highlights

  • Eye height is a reliable metric for scaling, for example, the heights of objects [1, 2], velocities [3], affordances [4], and egocentric distances [5, 6]

  • The sensory modality potentially used to specify eye height in virtual environments had not yet been empirically investigated for tasks in action space. In this set of experiments, we found instances where humans seem to rely more on postural information to determine eye height in virtual environments, as demonstrated by the predictable variations in their distance estimates

  • Our Experiments 1–3 demonstrate that variations in the virtual eye height have predictable effects on perceived egocentric distances and that these effects are consistent across different postures

Introduction

Eye height is a reliable metric for scaling, for example, the heights of objects [1, 2], velocities [3], affordances [4], and egocentric distances [5, 6]. Sedgwick [6] proposed that the angle of declination below the horizon (AoD) to a target on the ground, in combination with the (known) eye height (EH) of the observer, can be used to determine distance (d) following the equation d = EH / tan(AoD) (see [5, 16]). This tight coupling of eye height and perceived egocentric distance enables us to make predictions about how each potential source of information for determining eye height in VR, portrayed in a head-mounted display (HMD), may influence perceived distance. We expect that observers in a virtual environment determine their eye height by relying on postural cues, because they may be unable to use the visual information or may ignore it. If observers scale the angle of declination with their unchanged postural eye height, raising the virtual viewpoint increases the AoD to a target and therefore shrinks the computed distance. This should lead to a compression of distances following an increase in visual eye height, and an expansion of distances following a decrease in visual eye height, compared to baseline estimates where the virtual eye height matches the postural eye height.
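To make the prediction concrete, here is a minimal sketch in Python of the d = EH / tan(AoD) relation and of the predicted compression when the virtual viewpoint is raised. The function name, variable names, and example values (a 1.6 m eye height, a 5 m target, a 0.5 m viewpoint offset) are our own illustrations, not values from the paper.

```python
import math

def predicted_distance(eye_height_m, aod_deg):
    """Distance to a ground target from eye height and the angle of
    declination below the horizon: d = EH / tan(AoD) (Sedgwick [6])."""
    return eye_height_m / math.tan(math.radians(aod_deg))

# Example: a target on the ground at 5 m, viewed from a 1.6 m eye height,
# lies at an angle of declination of atan(1.6 / 5) ~= 17.74 degrees.
postural_eh = 1.6       # observer's actual (postural) eye height, in metres
target_distance = 5.0   # physical distance to the target, in metres
aod = math.degrees(math.atan(postural_eh / target_distance))

# Raising the virtual camera by 0.5 m increases the angle of declination to
# the same target; an observer who still assumes the postural eye height
# then recovers a shorter (compressed) distance.
raised_aod = math.degrees(math.atan((postural_eh + 0.5) / target_distance))

print(predicted_distance(postural_eh, aod))         # ~= 5.0 m (baseline)
print(predicted_distance(postural_eh, raised_aod))  # ~= 3.8 m (compressed)
```

Lowering the virtual eye height works symmetrically: the angle of declination shrinks, so a postural-cue observer recovers an expanded distance.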
