Abstract

We propose a process for composing stereoscopic images from measured values in an automotive 3D Head-Up Display (3D HUD) equipped with a parallax barrier and an eye-tracking camera. The information required for compositing is measured with an RGB camera and linked to the camera (eye) position coordinates; the measurement is performed at dozens of camera positions. The measured information is then approximated by a polynomial with the camera position and LCD pixel coordinates as variables, so that the information required for image composition can be generated at any camera (eye) position. This calibrates out variations and distortions in the mounting of the windshield and eye-tracking camera, and realizes low-crosstalk 3D image rendering, optimized for any eye position, in real time.
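The core idea described above — measuring calibration values at dozens of camera positions and approximating them by a polynomial in camera position and LCD pixel coordinates — can be sketched as a least-squares fit. This is a minimal illustration, not the paper's implementation: the function names, the degree-2 polynomial basis, and the use of a single scalar calibration value per sample are all assumptions.

```python
import numpy as np

def poly_features(cx, cy, px):
    """Degree-2 polynomial basis in camera position (cx, cy) and
    LCD pixel coordinate px (the basis choice is an assumption)."""
    return np.stack([
        np.ones_like(cx),
        cx, cy, px,
        cx * cy, cx * px, cy * px,
        cx**2, cy**2, px**2,
    ], axis=-1)

def fit_calibration(cam_xy, pix_x, measured):
    """Least-squares fit of measured calibration values (e.g. the
    barrier phase needed for low crosstalk) to the polynomial model."""
    A = poly_features(cam_xy[:, 0], cam_xy[:, 1], pix_x)
    coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
    return coeffs

def evaluate(coeffs, cam_xy, pix_x):
    """Predict the calibration value at an arbitrary eye position,
    enabling real-time generation for positions never measured."""
    return poly_features(cam_xy[:, 0], cam_xy[:, 1], pix_x) @ coeffs

# Synthetic check with a known quadratic ground truth:
rng = np.random.default_rng(0)
cam = rng.uniform(-1.0, 1.0, size=(40, 2))   # dozens of camera positions
pix = rng.uniform(0.0, 1.0, size=40)         # sampled LCD pixel coordinate
truth = 0.3 + 0.5 * cam[:, 0] - 0.2 * pix + 0.1 * cam[:, 1] * pix
coeffs = fit_calibration(cam, pix, truth)
pred = evaluate(coeffs, cam, pix)
assert np.allclose(pred, truth)
```

Because the synthetic ground truth lies in the span of the degree-2 basis, the fit recovers it exactly; real measured data would instead be smoothed by the polynomial approximation.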
