Abstract

Decoding a user's natural grasp intent enhances the applicability of wearable robots, improving the daily lives of individuals with disabilities. Electroencephalogram (EEG) and eye-movement signals are two natural representations of grasp intent, and current studies decode human intent by fusing the two. However, the neural correlation between these two signals remains unclear. This paper therefore explores the consistency between EEG and eye movements in natural grasp intention estimation. Specifically, six grasp-intent pairs are decoded by combining feature vectors from both modalities and applying the optimal classifier. Extensive experimental results indicate that the coupling between the EEG and eye-movement intent patterns remains intact when the user generates a natural grasp intent, and that the EEG pattern is consistent with the eye-movement pattern across the task pairs. Moreover, the findings reveal a solid connection between EEG and eye movements even when considering cortical EEG (originating from the visual or motor cortex) and the presence of a suboptimal classifier. Overall, this work uncovers the coupling correlation between EEG and eye movements and provides a reference for intention estimation.
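The fusion-and-classification step described above can be sketched minimally as follows. The abstract does not specify feature dimensions, trial counts, or the classifier used, so all of those are illustrative assumptions here: synthetic EEG and eye-movement feature vectors are concatenated per trial, and a simple nearest-centroid rule stands in for the paper's (unspecified) optimal classifier on one grasp-intent pair.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial features: 32 EEG features and 8 eye-movement
# features per trial are illustrative choices, not values from the paper.
n_trials = 100
eeg = rng.normal(size=(n_trials, 32))
eye = rng.normal(size=(n_trials, 8))

# Binary labels for one grasp-intent pair; shift class 1 in the EEG
# features so the two intents are separable in this synthetic data.
labels = np.repeat([0, 1], n_trials // 2)
eeg[labels == 1] += 1.0

# Fuse the two modalities by simple feature-vector concatenation.
fused = np.hstack([eeg, eye])

# Nearest-centroid classification: assign each trial to the closer
# class mean in the fused feature space.
centroids = np.stack([fused[labels == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(fused[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == labels).mean()
print(f"pair accuracy: {accuracy:.2f}")
```

In a real pipeline the concatenated vector would hold extracted EEG features (e.g., band power) and gaze features, and a stronger classifier would be selected by cross-validation; the sketch only shows the fusion structure.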
