Abstract

Background

Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g., the frontal plane). When visual stimuli are presented at variable depths (e.g., the transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits, and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades, and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization.

Results

Within the transverse plane, our algorithm reliably differentiates saccades from fixations when visual stimuli are static, and smooth pursuits from saccades and fixations when visual stimuli are dynamic.

Conclusions

The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.

Electronic supplementary material

The online version of this article (doi:10.1186/s12984-015-0107-4) contains supplementary material, which is available to authorized users.
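To make the two-step approach described above concrete, here is a minimal sketch in Python/NumPy of how angular gaze velocity could be computed for point-of-regard (POR) samples in a transverse plane and then thresholded into gaze events. The assumed eye position, the treatment of the eye as static, and the 5 and 30 deg/s thresholds are illustrative placeholders, not the paper's geometry or calibrated values.

```python
import numpy as np

def angular_velocity(por_xy, eye_pos, t):
    """Angular gaze velocity (deg/s) for POR samples in a transverse plane.

    por_xy  : (N, 2) point-of-regard samples in plane coordinates (m)
    eye_pos : (3,) assumed eye position relative to the plane origin (m)
    t       : (N,) sample timestamps (s)
    """
    # Lift the planar POR samples into 3-D (the plane sits at z = 0) and
    # form unit gaze vectors from the eye to each sample.
    por_3d = np.column_stack([por_xy, np.zeros(len(por_xy))])
    g = por_3d - np.asarray(eye_pos, float)
    g /= np.linalg.norm(g, axis=1, keepdims=True)
    # The angle between successive unit gaze vectors gives the angular
    # displacement; dividing by the sample interval gives velocity.
    cos_dtheta = np.clip(np.sum(g[:-1] * g[1:], axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_dtheta)) / np.diff(t)

def classify(omega, saccade_thresh=30.0, pursuit_thresh=5.0):
    """Label inter-sample intervals by velocity; the thresholds here are
    illustrative defaults, not the paper's calibrated values."""
    labels = np.full(omega.shape, "fixation", dtype=object)
    labels[omega > pursuit_thresh] = "pursuit"
    labels[omega > saccade_thresh] = "saccade"
    return labels
```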

Highlights

  • Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity

  • Classification of smooth pursuits, saccades and fixations: we present a generic algorithm for classifying smooth pursuits, saccades and fixations in tasks that present multiple moving objects simultaneously (see the sketch after this list)

  • Manual classification determined that the majority of each Fixation Phase involved fixations on the fixation circle, whereas the Movement Phase included fixations, saccades, and pursuits
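The paper's generic algorithm is not reproduced here; the sketch below shows one plausible ingredient for the multiple-moving-object case, assuming Python/NumPy and a hypothetical matching radius tol: once samples have been flagged as pursuit-speed, each one is assigned to whichever moving object its POR is tracking, if any.

```python
import numpy as np

def match_pursuit_targets(por_xy, targets_xy, tol=0.02):
    """Assign pursuit-speed POR samples to the nearest moving object.

    por_xy     : (N, 2) point-of-regard samples in plane coordinates (m)
    targets_xy : (N, K, 2) positions of K moving objects at each sample (m)
    tol        : hypothetical matching radius (m), not a value from the paper
    Returns the tracked object's index per sample, or -1 when no object
    lies within tolerance.
    """
    # Distance from the POR to every object at every sample: (N, K).
    d = np.linalg.norm(targets_xy - por_xy[:, None, :], axis=2)
    nearest = np.argmin(d, axis=1)
    matched = d[np.arange(len(por_xy)), nearest] <= tol
    return np.where(matched, nearest, -1)
```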



Introduction

Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. Integrating video-based remote eye tracking with these systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity, since such processes contribute to eye-hand coordination [5, 6]. This possibility can be realized by adding video-based eye tracking to upper-extremity robots. Video-based eye trackers can non-invasively obtain information on where a subject is directly looking, commonly referred to as the "point-of-regard" (POR) [7]. This information allows researchers to quantify overt mechanisms [8, 9] of visual search by ascertaining whether objects of interest have been directly viewed (foveated).
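As a rough illustration of such a foveation check, the sketch below tests whether a POR sample falls within an assumed foveal angle of an object lying in the same transverse plane. The function name, the 1 deg default, and the known static eye position are hypothetical assumptions for illustration, not taken from the paper.

```python
import numpy as np

def was_foveated(por_xy, obj_xy, eye_pos, max_angle_deg=1.0):
    """Return True when a POR sample falls within an assumed foveal angle
    of an object lying in the same transverse plane (z = 0).

    por_xy, obj_xy : (2,) positions in plane coordinates (m)
    eye_pos        : (3,) assumed eye position relative to the plane (m)
    max_angle_deg  : illustrative foveal radius in degrees
    """
    por_3d = np.append(np.asarray(por_xy, float), 0.0)
    obj_3d = np.append(np.asarray(obj_xy, float), 0.0)
    eye = np.asarray(eye_pos, float)
    # Unit vectors from the eye to the POR and to the object.
    g = (por_3d - eye) / np.linalg.norm(por_3d - eye)
    o = (obj_3d - eye) / np.linalg.norm(obj_3d - eye)
    angle = np.degrees(np.arccos(np.clip(np.dot(g, o), -1.0, 1.0)))
    return angle <= max_angle_deg
```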


