Abstract

Visual aberrations are imperfections in human vision that play an important role in our everyday lives. Existing algorithms to simulate such conditions are either not suited for low-latency workloads or limit the kinds of supported aberrations. In this paper, we present a new simulation method that supports arbitrary visual aberrations and runs at interactive, near real-time performance on commodity hardware. Furthermore, our method only requires a single set of on-axis phase aberration coefficients as input and handles dynamic changes of pupil size and focus distance at runtime. We first describe a custom parametric eye model and a parameter estimation method for finding the physical properties of the simulated eye. Next, we describe our parameter sampling strategy, which we use with the estimated eye model to establish a coarse point-spread function (PSF) grid. We also propose a GPU-based interpolation scheme for the kernel grid, which we use at runtime to obtain the final vision simulation by extending an existing tile-based convolution approach. We showcase the capabilities of our eye estimation and rendering processes using several different eye conditions and provide the corresponding performance metrics to demonstrate the applicability of our method for interactive environments.
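To illustrate the core quantity the abstract refers to, the following is a minimal sketch of how a point-spread function can be derived from a pupil phase aberration map via Fourier optics. The function and parameter names are illustrative assumptions, not the paper's actual implementation, which samples PSFs over a grid and interpolates them on the GPU.

```python
import numpy as np

def psf_from_phase(phase, aperture):
    """Compute an intensity PSF from a wavefront phase map (radians)
    defined over a pupil-plane aperture mask (True inside the pupil)."""
    pupil = aperture * np.exp(1j * phase)  # complex pupil function
    # Far-field diffraction pattern via a centered 2-D FFT
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2               # intensity PSF
    return psf / psf.sum()                 # normalize total energy to 1

# Example: a pure defocus term (Zernike-style polynomial) on a circular pupil
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r2 = x**2 + y**2
aperture = r2 <= 1.0
defocus = 2.0 * np.sqrt(3.0) * (2.0 * r2 - 1.0)  # defocus polynomial
psf = psf_from_phase(defocus * aperture, aperture)
```

A full simulator would evaluate such PSFs for many retinal positions, pupil sizes, and focus distances, then convolve the rendered image with the spatially varying kernels, which is the part the paper accelerates with tile-based convolution.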

