Abstract

Human spatial search combines visual search and motion control. Both problems have been investigated separately for decades, but the coordination between them has not. Analyzing the coordination of sensory-motor behavior through teleoperation can improve our understanding of human search strategies as well as autonomous search algorithms. This research proposes a novel approach to analyzing the coordination between visual attention, observed via gaze patterns, and motion control. The approach estimates human operators' 3D gaze using a Gaussian mixture model (GMM), a hidden Markov model (HMM), and sparse inverse covariance estimation (SICE). Analysis of the human experimental data shows that fixation is used primarily to look at the target, smooth pursuit is coupled with robot rotation and used to scan newly explored areas, and saccades are coupled with forward motion and likewise used to scan newly explored areas. These insights are used to build a functional model of human teleoperated search.
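The abstract does not detail how the GMM is applied, but a common approach in gaze analysis is to cluster eye-movement speed into the three oculomotor classes it names (fixation, smooth pursuit, saccade). The sketch below is an illustrative assumption, not the paper's method: it fits a 3-component GMM to log gaze speed on synthetic data, with all speed values and thresholds invented for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic gaze speeds (deg/s) for three oculomotor regimes.
# All magnitudes here are illustrative, not from the paper.
fixation = rng.normal(1.0, 0.3, 300)      # near-zero drift
pursuit = rng.normal(15.0, 3.0, 300)      # smooth, moderate speed
saccade = rng.normal(300.0, 50.0, 100)    # brief, very fast jumps
speeds = np.abs(np.concatenate([fixation, pursuit, saccade]))

# Fit a 3-component GMM on log-speed; the components separate
# the regimes because their speed distributions barely overlap.
gmm = GaussianMixture(n_components=3, random_state=0)
features = np.log(speeds).reshape(-1, 1)
labels = gmm.fit_predict(features)

# Order components by mean speed: fixation < pursuit < saccade.
order = np.argsort(gmm.means_.ravel())
names = {order[0]: "fixation", order[1]: "smooth pursuit",
         order[2]: "saccade"}
for k in order:
    mean_speed = float(np.exp(gmm.means_.ravel()[k]))
    print(f"{names[k]}: ~{mean_speed:.1f} deg/s")
```

In a full pipeline along the lines the abstract suggests, an HMM could then smooth these frame-by-frame labels over time, and SICE could estimate which gaze classes and motion commands (rotation, forward motion) co-vary.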
