We present a novel framework for the detection and continuous evaluation of 3D motion perception using a virtual reality environment with built-in eye tracking. We created a biologically motivated virtual scene in which a ball moved in a restricted Gaussian random walk against a background of 1/f noise. Sixteen visually healthy participants were asked to follow the moving ball while their eye movements were monitored binocularly with the built-in eye tracker. We calculated the 3D convergence positions of their gaze from the fronto-parallel coordinates of the two eyes using linear least-squares optimization. Subsequently, to quantify 3D pursuit performance, we employed a first-order linear kernel analysis known as the Eye Movement Correlogram technique to separately analyze the horizontal, vertical and depth components of the eye movements. Finally, we checked the robustness of our method by adding systematic and variable noise to the gaze directions and re-evaluating 3D pursuit performance. We found that pursuit performance for the motion-through-depth component was significantly reduced compared to that for the fronto-parallel motion components. Our technique remained robust in evaluating 3D motion perception even when systematic and variable noise was added to the gaze directions. The proposed framework enables the assessment of 3D motion perception by evaluating continuous pursuit performance through eye tracking. Our framework paves the way for a rapid, standardized and intuitive assessment of 3D motion perception in patients with various eye disorders.
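The two analysis steps named in the abstract can be sketched in code. The snippet below is an illustrative reconstruction, not the authors' implementation: `convergence_point` recovers a 3D gaze convergence position as the least-squares point closest to the two eyes' gaze rays, and `correlogram` computes a first-order linear kernel as the normalized cross-correlation between stimulus and eye velocities. All function names, parameters, and the toy eye geometry are assumptions for demonstration.

```python
import numpy as np

def convergence_point(origins, directions):
    """Least-squares 3D point closest to a set of gaze rays.

    origins: (n, 3) eye positions; directions: (n, 3) gaze vectors.
    Minimizes the summed squared perpendicular distance to all rays
    by solving the normal equations (sum P_i) x = sum P_i o_i, where
    P_i projects onto the plane orthogonal to ray i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def correlogram(stim_vel, eye_vel, max_lag):
    """First-order kernel estimate: normalized cross-correlation
    of stimulus velocity and eye velocity over a range of lags."""
    s = stim_vel - stim_vel.mean()
    e = eye_vel - eye_vel.mean()
    denom = np.sqrt(np.sum(s**2) * np.sum(e**2))
    lags = np.arange(-max_lag, max_lag + 1)
    vals = []
    for k in lags:
        if k >= 0:
            vals.append(np.dot(s[:len(s) - k], e[k:]) / denom)
        else:
            vals.append(np.dot(s[-k:], e[:len(e) + k]) / denom)
    return lags, np.array(vals)

# Toy usage: eyes 6 cm apart fixating a target 0.5 m ahead.
origins = np.array([[-0.03, 0.0, 0.0], [0.03, 0.0, 0.0]])
dirs = np.array([[0.03, 0.0, 0.5], [-0.03, 0.0, 0.5]])
point = convergence_point(origins, dirs)  # ~ (0, 0, 0.5)
```

In practice the same correlogram would be computed independently for the horizontal, vertical, and depth velocity traces, with the peak amplitude and latency summarizing pursuit performance per component.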