Abstract

This paper presents an improved 3D eye movement analysis algorithm for binocular eye tracking within Virtual Reality for visual inspection training. The user's gaze direction, head position and orientation are tracked to allow recording of the user's fixations within the environment. The paper summarizes methods for (1) integrating the eye tracker into a Virtual Reality framework, (2) calculating the user's 3D gaze vector, and (3) calibrating the software to estimate the user's inter-pupillary distance post-facto. New techniques are presented for eye movement analysis in 3D for improved signal noise suppression. The paper describes (1) the use of Finite Impulse Response (FIR) filters for eye movement analysis, (2) the utility of adaptive thresholding and fixation grouping, and (3) a heuristic method to recover eye movement data lost due to miscalibration. While the linear signal analysis approach is itself not new, its application to eye movement analysis in three dimensions advances traditional 2D approaches since it takes into account the six degrees of freedom of head movement and is resolution-independent. Results indicate improved noise suppression over our previous signal analysis approach.
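As an illustration of the kind of computation the abstract refers to, the Python sketch below shows (a) one common way to estimate a 3D point of regard from the two gaze rays, as the midpoint of their closest approach, and (b) FIR smoothing of the resulting angular-velocity signal followed by a simple adaptive (mean plus k standard deviations) velocity threshold for fixation detection. The function names, parameters, and the particular convergence and thresholding choices are illustrative assumptions, not the paper's exact implementation.

    import numpy as np

    def gaze_point_3d(left_origin, left_dir, right_origin, right_dir):
        """Estimate the 3D point of regard as the midpoint of the shortest
        segment between the two (possibly skew) gaze rays."""
        d1 = left_dir / np.linalg.norm(left_dir)
        d2 = right_dir / np.linalg.norm(right_dir)
        w0 = left_origin - right_origin
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b
        if denom < 1e-9:
            # Nearly parallel gaze rays (gaze at "infinity"): fall back to a
            # point far along the left gaze ray (arbitrary illustrative choice).
            return left_origin + 1000.0 * d1
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
        p1 = left_origin + t * d1
        p2 = right_origin + s * d2
        return 0.5 * (p1 + p2)

    def classify_fixations(gaze_points, head_pos, dt, taps=7, k=1.0):
        """FIR-smoothed angular velocity with an adaptive threshold.
        gaze_points: (N, 3) points of regard; head_pos: (N, 3) head positions;
        dt: sample interval in seconds. Returns a boolean array of length N-1,
        True where the interval between successive samples looks like fixation."""
        v = gaze_points - head_pos                      # viewing vectors
        v = v / np.linalg.norm(v, axis=1, keepdims=True)
        # visual angle swept between successive viewing vectors, in deg/s
        cosang = np.clip(np.einsum('ij,ij->i', v[:-1], v[1:]), -1.0, 1.0)
        velocity = np.degrees(np.arccos(cosang)) / dt
        # smooth the velocity profile with a moving-average FIR filter
        fir = np.ones(taps) / taps
        velocity_s = np.convolve(velocity, fir, mode='same')
        # adaptive threshold: mean + k standard deviations of smoothed velocity
        thresh = velocity_s.mean() + k * velocity_s.std()
        return velocity_s < thresh

Computing velocity over viewing vectors anchored at the tracked head position, rather than over 2D screen coordinates, is what makes the classification independent of display resolution and tolerant of head movement; grouping of adjacent below-threshold intervals into fixations would follow as a separate pass.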
