Abstract

Recently, Electrooculography-based Human-Computer Interaction (EOG-HCI) technology has gained widespread attention in industrial areas, including assistive robots, augmented-reality gaming, etc. However, eye movement classification (EMC), the fundamental step of EOG-HCI, remains a significant challenge: most existing methods are limited in their ability to extract discriminative features, which constrains their performance. To address this issue, a Residual Self-Calibrated network with multi-scale Channel Attention (RSCA), focusing on efficient feature extraction and enhancement, is proposed. The RSCA network first employs three self-calibrated convolution blocks within a hierarchical residual framework to fully extract discriminative multi-scale features. A multi-scale channel attention module then adaptively weights the learned features, aggregating multi-scale context information along the channel dimension to select the most discriminative representations and further boost performance. Comprehensive experiments were performed on 5 public datasets against 7 prevailing methods for comparative validation. The results confirm that the RSCA network significantly outperforms all other methods, establishing a state-of-the-art benchmark for EOG-based EMC. Furthermore, thorough ablation analyses confirm the effectiveness of each module within the RSCA network, providing valuable insights for the design of EOG-based deep models.
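The abstract does not specify the internals of the multi-scale channel attention module. As a rough illustration of the general idea it describes (aggregating context at multiple scales along the channel dimension and using it to re-weight features), the following NumPy sketch fuses a global branch (temporal average pooling) with a local point-wise branch into per-channel attention weights. All shapes, weight matrices, and the two-branch design are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ms_channel_attention(x, w1, w2, v1, v2):
    """Illustrative multi-scale channel attention on a 1-D EOG feature map.

    x : (C, T) array -- C channels, T time steps.
    The global branch summarizes the whole sequence per channel; the
    local branch applies a point-wise (1x1) transform at every step.
    Both are fused and squashed to weights in (0, 1) that rescale x.
    """
    # Global context: per-channel descriptor via temporal average pooling,
    # passed through a bottleneck MLP (hypothetical weights w1, w2).
    g = x.mean(axis=1, keepdims=True)        # (C, 1)
    g = w2 @ np.maximum(w1 @ g, 0.0)         # (C, 1)
    # Local context: point-wise transform at each time step (v1, v2).
    l = v2 @ np.maximum(v1 @ x, 0.0)         # (C, T)
    # Fuse the two scales; broadcasting expands g to (C, T).
    a = sigmoid(g + l)                       # attention weights in (0, 1)
    return a * x, a

rng = np.random.default_rng(0)
C, T, r = 8, 32, 2                           # r: bottleneck reduction ratio
x  = rng.standard_normal((C, T))
w1 = 0.1 * rng.standard_normal((C // r, C))  # global-branch bottleneck
w2 = 0.1 * rng.standard_normal((C, C // r))
v1 = 0.1 * rng.standard_normal((C // r, C))  # local-branch 1x1 transforms
v2 = 0.1 * rng.standard_normal((C, C // r))

y, a = ms_channel_attention(x, w1, w2, v1, v2)
print(y.shape, a.shape)
```

In a real network the 1x1 transforms would be learned convolutions and the module would sit after the residual feature extractor, but the broadcast-and-gate pattern shown here is the core mechanism.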
