Abstract

Eye tracking has shown great promise in many scientific fields and daily applications, ranging from the early detection of mental health disorders to foveated rendering in virtual reality (VR). These applications all call for a robust system that can sense and analyse high-frequency near-eye movements with high precision, which existing eye-tracking solutions based on CCD/CMOS cameras cannot guarantee. To bridge this gap, we propose Swift-Eye, an offline framework for precise and robust pupil estimation and tracking that supports high-frequency near-eye movement analysis, especially when the pupil region is partially occluded. Swift-Eye is built upon emerging event cameras to capture high-speed eye movements at high temporal resolution. A series of bespoke components then generate high-quality near-eye videos at frame rates above one kilohertz and handle pupil occlusion caused by involuntary eye blinks. According to our extensive evaluations on EV-Eye, a large-scale public dataset for eye tracking with event cameras, Swift-Eye is highly robust against significant occlusion: it improves the IoU and F1-score of pupil estimation by 20% and 12.5% respectively over the second-best competing approach when more than 80% of the pupil region is occluded by the eyelid. Finally, it provides continuous and smooth pupil traces at extremely high temporal resolution, supporting high-frequency eye movement analysis and a number of potential applications such as mental health diagnosis and behaviour-brain association studies. The implementation details and source code can be found at https://github.com/ztysdu/Swift-Eye.
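The IoU and F1-score cited above are standard overlap metrics between a predicted and a ground-truth binary pupil mask. The sketch below is a minimal illustration of how such metrics are typically computed; the function name and mask convention are assumptions for illustration and are not taken from the Swift-Eye codebase.

```python
import numpy as np

def pupil_mask_metrics(pred_mask: np.ndarray, gt_mask: np.ndarray):
    """Compute IoU and F1-score between two binary pupil masks
    (nonzero / True = pupil pixel). Illustrative only."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)

    tp = np.logical_and(pred, gt).sum()    # pixels correctly labelled pupil
    fp = np.logical_and(pred, ~gt).sum()   # predicted pupil, actually background
    fn = np.logical_and(~pred, gt).sum()   # pupil pixels missed by the prediction

    union = tp + fp + fn
    iou = tp / union if union > 0 else 1.0              # both masks empty: perfect
    denom = 2 * tp + fp + fn
    f1 = 2 * tp / denom if denom > 0 else 1.0
    return iou, f1
```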
