Abstract

Decoding the motion intention of the human upper limb from electroencephalography (EEG) signals has important practical value. However, existing decoding models are built under the attended state, i.e., while subjects focus on the motion task. In practice, people are often distracted by other tasks or environmental factors, which can impair decoding performance. To address this problem, we propose a hierarchical decoding model of human upper limb motion intention from EEG signals based on attention state estimation. The proposed model includes two components. First, an attention state detection (ASD) component estimates the attention state during upper limb movement. Then, a motion intention recognition (MIR) component decodes the motion intention using decoding models built separately under the attended and distracted states. Experimental results show that the proposed hierarchical decoding model performs well under both the attended and distracted states. This work can advance the application of human movement intention decoding and provides new insights into the study of brain-machine interfaces.
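The two-stage structure described in the abstract can be sketched as a small routing pipeline. This is a minimal illustration, not the paper's implementation: the classifier internals, thresholds, and feature values below are all invented placeholders.

```python
class ThresholdClassifier:
    """Toy stand-in for a trained classifier: predicts 1 when the mean
    feature value exceeds a threshold, else 0. Purely illustrative --
    the paper does not specify these internals."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, features):
        return 1 if sum(features) / len(features) > self.threshold else 0


class HierarchicalDecoder:
    """Two-stage decoder: an attention-state detector (ASD) routes each
    EEG trial to the motion-intention recognizer (MIR) trained under the
    matching state (0 = attended, 1 = distracted)."""
    def __init__(self, asd, mir_attended, mir_distracted):
        self.asd = asd
        self.mir = {0: mir_attended, 1: mir_distracted}

    def predict(self, features):
        state = self.asd.predict(features)               # stage 1: attention state
        return state, self.mir[state].predict(features)  # stage 2: motion intention


# Hypothetical usage with placeholder models and features
asd = ThresholdClassifier(0.5)
decoder = HierarchicalDecoder(asd, ThresholdClassifier(0.25), ThresholdClassifier(0.8))
state, intention = decoder.predict([0.1, 0.2, 0.3])  # low mean -> attended branch
```

The key design point is that each MIR model only ever sees trials from the state it was trained on, so the ASD stage shields the intention decoders from the state mismatch described in the abstract.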

Highlights

  • Electroencephalography (EEG) signals can reflect brain activities [1]

  • A Wilcoxon test showed that the accuracy difference between the two states was statistically significant (p = 0.035)

  • To address the effect of the attention state on decoding performance, we proposed a hierarchical decoding model of upper limb motion intention that integrates a recognition model of attention states with decoding models of upper limb movement intention built separately under the attended and distracted states
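The paired Wilcoxon comparison mentioned in the highlights can be illustrated with a small exact signed-rank test. The per-subject accuracies below are invented placeholders, not the paper's data, so the resulting p-value will not match the reported p = 0.035.

```python
from itertools import product

def wilcoxon_signed_rank(x, y):
    """Minimal exact two-sided Wilcoxon signed-rank test for small paired
    samples (assumes no zero differences). Enumerates all 2^n sign
    assignments, which is only feasible for small n."""
    d = [a - b for a, b in zip(x, y)]
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0.0] * len(d)
    i = 0
    while i < len(order):  # tied |d| values share their average rank
        j = i
        while j < len(order) and abs(d[order[j]]) == abs(d[order[i]]):
            j += 1
        for k in range(i, j):
            ranks[order[k]] = (i + 1 + j) / 2.0
        i = j
    total = sum(ranks)
    w_pos = sum(r for r, v in zip(ranks, d) if v > 0)
    w = min(w_pos, total - w_pos)
    # Exact p: fraction of sign assignments at least as extreme as observed.
    hits = 0
    for signs in product((0, 1), repeat=len(d)):
        wp = sum(r for r, s in zip(ranks, signs) if s)
        if min(wp, total - wp) <= w:
            hits += 1
    return w, hits / 2 ** len(d)

# Invented per-subject accuracies (attended vs. distracted); not real data.
acc_attended = [0.82, 0.79, 0.85, 0.88, 0.76, 0.81, 0.84, 0.80]
acc_distracted = [0.74, 0.75, 0.78, 0.80, 0.72, 0.77, 0.79, 0.73]

w, p = wilcoxon_signed_rank(acc_attended, acc_distracted)
```

Because every invented subject is more accurate in the attended state, the statistic is 0 and the exact p is 2/2^8 ≈ 0.008; the paper's reported p = 0.035 comes from its own experimental data.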



Introduction

Electroencephalography (EEG) signals can reflect brain activities [1]. Studies have shown that it is feasible to use EEG signals to decode mental states and human movement intentions [2]. Researchers have conducted numerous studies on decoding upper limb movement intention from EEG signals. In 2008, Hammon et al. [3] were the first to extract hand-movement-related information from EEG signals and use it as features to identify hand motion intention and direction. In 2012, Eileen et al. [4] used 0.1-4 Hz EEG signals to detect the self-paced reaching movement intention of the left and right hands. In 2014, Eduardo et al. [5] examined motor intention from the EEG correlation of 7

