Abstract

A common technology used to present immersive, collaborative environments is the head-mounted virtual reality (VR) display. However, due to engineering limitations, variability in manufacturing, and person-to-person differences in eye position, virtual environments are often geometrically inaccurate. Correcting these inaccuracies typically requires complicated or interactive calibration procedures. In this document we present a method for calibrating head-mounted displays and other display surfaces using an automated, low-cost camera system. A unique aspect of this method is that geometric distortion, field of view, and chromatic aberration are calibrated without a priori knowledge of the display system's intrinsic parameters. Since this calibration method can easily measure display distortions, we extend our work to measure the effect of eye position on the apparent location of imagery presented in a VR head-mounted display. We test a range of plausible eye positions that may result from person-to-person variation in display placement and interpupillary distance. We observed that the pattern of geometric distortion introduced by the display's optical system changes substantially as the eye moves from one position to the next. Although many commercial and research VR systems calibrate interpupillary distance and optical distortion separately, this may be insufficient, as eye position influences distortion characteristics.
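The abstract does not specify the paper's distortion model, but the core idea of camera-based geometric calibration can be illustrated with a synthetic sketch: photograph a known grid through the display optics, match the observed points to their ideal positions, and fit a distortion model to the displacements. The two-term radial polynomial below and its coefficients are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def radial_distort(pts, k1, k2):
    """Apply a two-term radial distortion model to normalized 2D points.

    Illustrative model only; the paper's actual distortion model is not
    given in the abstract.
    """
    r2 = np.sum(pts**2, axis=1, keepdims=True)
    return pts * (1.0 + k1 * r2 + k2 * r2**2)

def fit_radial_model(ideal, observed):
    """Recover (k1, k2) by linear least squares from matched ideal and
    observed grid points, as a camera-based pipeline might do after
    detecting a displayed calibration grid in a photograph."""
    r2 = np.sum(ideal**2, axis=1, keepdims=True)
    # observed - ideal = ideal * (k1*r^2 + k2*r^4), which is linear in k1, k2
    A = np.column_stack([(ideal * r2).reshape(-1),
                         (ideal * r2**2).reshape(-1)])
    b = (observed - ideal).reshape(-1)
    k, *_ = np.linalg.lstsq(A, b, rcond=None)
    return k[0], k[1]

# Synthetic stand-in for a photographed calibration grid (no camera needed).
xs = np.linspace(-0.5, 0.5, 9)
grid = np.array([[x, y] for y in xs for x in xs])
observed = radial_distort(grid, k1=-0.18, k2=0.05)  # assumed coefficients
k1_est, k2_est = fit_radial_model(grid, observed)
```

Because the model is linear in the coefficients, a noiseless synthetic grid recovers them exactly; with real photographs, corner-detection noise would make the fit approximate.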
