In electroencephalography (EEG)-based affective brain-computer interfaces (aBCIs), there is a consensus that EEG features extracted from different frequency bands and channels differ in their ability to express emotion. Moreover, EEG signals are weak and non-stationary, which easily causes distribution discrepancies between EEG data collected at different times; it is therefore necessary to explore affective activation patterns in cross-session emotion recognition. To address these two problems, we propose a self-weighted semi-supervised classification (SWSC) model for joint EEG-based cross-session emotion recognition and affective activation pattern mining. Its merits include 1) using both labeled and unlabeled samples from different sessions to better capture data characteristics, 2) introducing a self-weighted variable to learn the importance of EEG features adaptively and quantitatively, and 3) automatically mining activation patterns, including the critical EEG frequency bands and channels, based on the learned self-weighted variable. Extensive experiments on the benchmark SEED_IV emotional data set show that SWSC achieves average accuracies of 77.40%, 79.55%, and 81.52% on three cross-session emotion recognition tasks. Moreover, SWSC identifies that the Gamma frequency band contributes the most and that the EEG channels in the prefrontal, left/right temporal, and (central) parietal lobes are more important for cross-session emotion recognition.
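To make the core idea concrete, the following is a minimal, illustrative sketch of a self-weighted semi-supervised objective; it is an assumed formulation for exposition, not the paper's exact model. The least-squares loss, the classifier matrix $\mathbf{W}$, the inferred soft labels $\mathbf{Y}_u$, the regularization weight $\lambda$, and the simplex constraint on the self-weight vector $\boldsymbol{\theta}$ are all assumptions here:

% Illustrative formulation only; SWSC's actual objective may differ.
% X_l, X_u: labeled/unlabeled EEG feature matrices (D x n_l, D x n_u),
%   with D = (#frequency bands) x (#channels) feature dimensions;
% Y_l: known one-hot labels; Y_u: soft labels inferred for unlabeled data;
% theta: nonnegative self-weight vector over the D feature dimensions;
% W: classifier weights; lambda: regularization parameter.
\[
\min_{\mathbf{W},\,\mathbf{Y}_u,\,\boldsymbol{\theta}}\;
\bigl\lVert \mathbf{X}_l^{\top}\operatorname{diag}(\boldsymbol{\theta})\,\mathbf{W} - \mathbf{Y}_l \bigr\rVert_F^2
+ \bigl\lVert \mathbf{X}_u^{\top}\operatorname{diag}(\boldsymbol{\theta})\,\mathbf{W} - \mathbf{Y}_u \bigr\rVert_F^2
+ \lambda \lVert \mathbf{W} \rVert_F^2,
\quad \text{s.t.}\; \boldsymbol{\theta} \ge 0,\; \mathbf{1}^{\top}\boldsymbol{\theta} = 1.
\]

Objectives of this form are typically solved by alternating updates of $\mathbf{W}$, $\mathbf{Y}_u$, and $\boldsymbol{\theta}$. Under such a formulation, large entries of $\boldsymbol{\theta}$ mark the band-channel features that consistently reduce the fitting error across sessions, which is what would allow the critical frequency bands and channels to be read off directly from the learned weights.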