Abstract

Electroencephalogram (EEG)-based mental fatigue detection is an objective and effective detection method. However, variability across individuals and across mental fatigue experimental paradigms limits the generalizability of classification models to new subjects and experiments. This paper proposes a Spatio-Temporal Transformer (STTransformer) architecture built on a two-stream attention network. Using datasets drawn from three different mental fatigue experimental tasks and from different individuals, STTransformer performs cross-task and cross-subject mental fatigue transfer learning and achieves promising results. The architecture follows the model-transfer paradigm: deep network parameters are pre-trained in the source domain to acquire prior knowledge, part of the network is then frozen, and the model is transferred to a target domain containing similar samples for fine-tuning. By using multiple attention mechanisms to capture features common to different individuals and experimental paradigms, the architecture achieves strong transfer performance, demonstrated on multiple individuals and two mental fatigue experiments. Finally, we visualize selected feature maps through the attention mechanism, revealing two characteristics of mental fatigue and taking a step toward interpretable deep learning.
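To make the pre-train/freeze/fine-tune procedure concrete, the following is a minimal sketch of that transfer recipe, assuming a PyTorch-style two-stream model. All names here (SpatioTemporalNet, the layer sizes, the synthetic loaders) are illustrative stand-ins, not the authors' actual STTransformer code.

```python
# Minimal sketch of freeze-and-fine-tune transfer with a toy two-stream
# attention model: one attention stream over EEG channels (spatial) and
# one over time points (temporal). Hypothetical names and shapes.
import torch
import torch.nn as nn

class SpatioTemporalNet(nn.Module):
    def __init__(self, n_channels=32, n_samples=128, d_model=64, n_classes=2):
        super().__init__()
        self.spatial_proj = nn.Linear(n_samples, d_model)    # per-channel embedding
        self.temporal_proj = nn.Linear(n_channels, d_model)  # per-timepoint embedding
        self.spatial_attn = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
        self.temporal_attn = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
        self.head = nn.Linear(2 * d_model, n_classes)        # fatigue / alert logits

    def forward(self, x):  # x: (batch, channels, samples)
        s = self.spatial_attn(self.spatial_proj(x)).mean(dim=1)
        t = self.temporal_attn(self.temporal_proj(x.transpose(1, 2))).mean(dim=1)
        return self.head(torch.cat([s, t], dim=1))

def train(model, loader, epochs, lr=1e-3):
    # Optimize only the parameters that are still trainable.
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

# 1) Pre-train on the source domain (one task / subject group); synthetic data here.
model = SpatioTemporalNet()
source_loader = [(torch.randn(8, 32, 128), torch.randint(0, 2, (8,))) for _ in range(10)]
train(model, source_loader, epochs=3)

# 2) Freeze the attention streams; only the classifier head stays trainable.
for module in (model.spatial_attn, model.temporal_attn):
    for p in module.parameters():
        p.requires_grad = False

# 3) Fine-tune on the target domain (a new task / subject) at a lower learning rate.
target_loader = [(torch.randn(8, 32, 128), torch.randint(0, 2, (8,))) for _ in range(5)]
train(model, target_loader, epochs=3, lr=1e-4)
```

Which layers to freeze is a design choice; keeping the shared attention streams fixed and adapting only the head is one common variant of this recipe when source and target samples are similar.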
