Mental fatigue detection based on electroencephalogram (EEG) signals is an objective and effective detection method. However, inter-subject variability and differences across mental fatigue experimental paradigms limit the generalizability of classification models across subjects and experiments. This paper proposes a Spatio-Temporal Transformer (STTransformer) architecture based on a two-stream attention network. We use datasets from three different mental fatigue experimental tasks, collected from different individuals. STTransformer performs cross-task and cross-subject transfer learning for mental fatigue detection and achieves promising results. The architecture follows the model-transfer paradigm: a deep neural network is pre-trained in the source domain to acquire prior knowledge, then part of its parameters are frozen and the network is transferred to a target domain containing similar samples for fine-tuning. By using multiple attention mechanisms to capture features shared across individuals and experimental paradigms, the architecture achieves good transfer performance, demonstrated across multiple subjects and two mental fatigue experiments. We also use the attention mechanism to visualize selected feature maps, revealing two characteristics of mental fatigue and exploring the interpretability of the deep learning model.
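The pre-train / freeze / fine-tune procedure described above can be illustrated with a minimal sketch. The code below is a hypothetical PyTorch outline, not the authors' STTransformer implementation: the toy two-stream model (`TinySpatioTemporalNet`), the choice of which layers to freeze, and all hyperparameters are assumptions made only to show the transfer pattern.

```python
# Minimal sketch of source-domain pre-training followed by partial freezing
# and target-domain fine-tuning. Model and layer names are hypothetical.
import torch
import torch.nn as nn

class TinySpatioTemporalNet(nn.Module):
    """Toy stand-in for a two-stream spatio-temporal EEG classifier."""
    def __init__(self, n_channels=32, n_classes=2):
        super().__init__()
        # Spatial stream: mixes EEG channels with a 1x1 convolution.
        self.spatial = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=1), nn.ReLU())
        # Temporal stream: self-attention over time steps.
        self.temporal = nn.MultiheadAttention(embed_dim=16, num_heads=4,
                                              batch_first=True)
        self.classifier = nn.Linear(16, n_classes)

    def forward(self, x):               # x: (batch, channels, time)
        h = self.spatial(x)             # (batch, 16, time)
        h = h.transpose(1, 2)           # (batch, time, 16)
        h, _ = self.temporal(h, h, h)   # self-attention across time
        return self.classifier(h.mean(dim=1))  # pool over time, classify

def pretrain_then_finetune(source_loader, target_loader, epochs=5):
    model = TinySpatioTemporalNet()
    loss_fn = nn.CrossEntropyLoss()

    # 1) Pre-train all parameters on the source-domain task.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for x, y in source_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    # 2) Freeze the feature-extraction layers (the "prior knowledge").
    for p in model.spatial.parameters():
        p.requires_grad = False

    # 3) Fine-tune only the remaining parameters on the target domain.
    opt = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=1e-4)
    for _ in range(epochs):
        for x, y in target_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model
```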