Abstract

An off-axis monocular pupil tracker designed for eventual integration in ophthalmoscopes for eye movement stabilization is described and demonstrated. The instrument consists of light-emitting diodes, a camera, a field-programmable gate array (FPGA) and a central processing unit (CPU). The raw camera image undergoes background subtraction, field-flattening, one-dimensional low-pass filtering, thresholding and robust pupil edge detection on an FPGA pixel stream, followed by least-squares fitting of the pupil edge pixel coordinates to an ellipse in the CPU. Experimental data suggest that the proposed algorithms require raw images with a minimum of ∼32 gray levels to achieve sub-pixel pupil center accuracy. Tests with two different cameras operating at 575, 1250 and 5400 frames per second trained on a model pupil achieved 0.5-1.5 μm pupil center estimation precision with 0.6-2.1 ms combined image download, FPGA and CPU processing latency. Pupil tracking data from a fixating human subject show that tracker operation requires the adjustment of only a single parameter, namely an image intensity threshold. The performance of the proposed pupil tracker is limited by the camera download time (latency) and sensitivity (precision).
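The CPU stage described above fits the pupil edge pixel coordinates to an ellipse by least squares. The paper's exact formulation is not reproduced here; the following is a minimal sketch of one common approach, a linear least-squares fit of a general conic followed by recovery of the ellipse center (the function name and the normalization to a constant term of 1 are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def fit_ellipse_center(xs, ys):
    """Fit the general conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1
    to edge-pixel coordinates by linear least squares, then recover
    the ellipse center (where the conic's gradient vanishes)."""
    A = np.column_stack([xs**2, xs * ys, ys**2, xs, ys])
    coef, *_ = np.linalg.lstsq(A, np.ones_like(xs, dtype=float), rcond=None)
    a, b, c, d, e = coef
    # Gradient of the conic is zero at the center:
    #   2a*xc + b*yc = -d
    #   b*xc + 2c*yc = -e
    xc, yc = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
    return xc, yc
```

A robust implementation would also reject outlier edge pixels before fitting, since a single spurious edge point (e.g., from an eyelash) can bias the center estimate.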

Highlights

  • The human eye is in constant involuntary movement, even when fixating on a target [1,2,3]

  • The key to achieving pupil tracking with low latency is to process pixel values as soon as they arrive at the field-programmable gate array (FPGA), through what is often called a “pixel stream.”

  • A background image is generated as the median of a sequence of images collected with the camera region of interest (ROI), gain and exposure settings to be used for pupil tracking, and with the first lens of the optical setup covered
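The background step in the last highlight can be sketched as follows. This is a hypothetical illustration, assuming the covered-lens frames are available as a NumPy stack captured with the same ROI, gain and exposure settings used for tracking:

```python
import numpy as np

def build_background(frames):
    """Pixel-wise median over a stack of frames with shape (N, H, W),
    collected with the first lens of the optical setup covered.
    The median rejects transient outliers better than the mean."""
    return np.median(frames, axis=0)

def subtract_background(raw, background):
    """Subtract the fixed background from a raw frame, clipping at zero
    so dark pixels do not wrap around in unsigned arithmetic."""
    return np.clip(raw.astype(np.int32) - background.astype(np.int32),
                   0, None)
```

In the instrument itself this subtraction runs on the FPGA pixel stream; the NumPy version above only illustrates the arithmetic.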



Introduction

The human eye is in constant involuntary movement (rotation), even when fixating on a target [1,2,3]. It is important to recognize that retina tracking accuracy through pupil imaging could be fundamentally limited by the fact that the eyeball is not a rigid body [32,33], and that the crystalline lens wobbles in response to saccades [34,35]. This wobble can be corrected by modeling the lens as a damped harmonic oscillator, after estimating the undamped angular frequency and damping ratio of each eye [35].
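The damped-harmonic-oscillator model of post-saccadic lens wobble mentioned above can be sketched as follows. The parameter values in the test are purely illustrative; the per-eye undamped angular frequency and damping ratio would have to be estimated as in [35]:

```python
import math

def lens_wobble(t, amplitude, omega0, zeta, phase=0.0):
    """Underdamped harmonic-oscillator displacement
        x(t) = A * exp(-zeta * omega0 * t) * cos(omega_d * t + phase),
    where omega_d = omega0 * sqrt(1 - zeta^2) is the damped angular
    frequency, omega0 the undamped angular frequency, and zeta < 1 the
    damping ratio. Parameters here are illustrative placeholders."""
    omega_d = omega0 * math.sqrt(1.0 - zeta**2)
    return (amplitude * math.exp(-zeta * omega0 * t)
            * math.cos(omega_d * t + phase))
```

Given fitted parameters for an eye, subtracting this model trajectory from the measured pupil/lens signal after each detected saccade would yield the wobble-corrected estimate.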

Methods
Illumination
Cameras
Optical setups
Computing hardware
Raw data re-packaging
Background subtraction and field flattening
Low-pass filtering
Thresholding
Detection of left-right edge pairs
Ellipse fitting
Subjects
Image dynamic range
Precision
Latency
Human subject pupil tracking
Summary
