Sequence recommendation aims to predict the next items and behaviors a user is likely to be interested in. It considers not only the user's individual interactions but also the user's historical behavior sequence. However, sequence recommendation still faces challenges: existing models fall short in capturing long-term dependencies and in fully exploiting contextual information. To address these challenges, we propose a four-channel model based on a multi-level self-attention network with gated spiking neural P (GSNP) systems, termed the SR-MAG model. The four channels are divided into two groups, each composed of an attention channel and a GSNP attention channel. The two groups process long-term and short-term sequences, respectively, to obtain long-term and short-term attention-channel features. These features are then passed through a self-attention network to effectively extract user context information. The proposed SR-MAG model is evaluated on three real-world datasets and compared with ten baseline methods. Experimental results demonstrate the effectiveness of SR-MAG in sequence recommendation tasks.
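
To fix ideas, the following is a minimal sketch of the four-channel layout described above: two groups, each pairing a plain attention channel with a gated attention channel, applied to the long-term sequence and to a short-term suffix, with the channel features fused by a self-attention layer. All module names, dimensions, the length of the short-term window, and the sigmoid form of the gate are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only; names, dimensions, and gating form are assumptions.
import torch
import torch.nn as nn


class GatedAttentionChannel(nn.Module):
    """Self-attention channel whose output is modulated by a learned gate
    (an assumed stand-in for the GSNP-style gating)."""

    def __init__(self, dim, heads=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, x):
        out, _ = self.attn(x, x, x)
        return self.gate(x) * out            # gated attention feature


class AttentionChannel(nn.Module):
    """Plain self-attention channel."""

    def __init__(self, dim, heads=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        out, _ = self.attn(x, x, x)
        return out


class FourChannelSketch(nn.Module):
    def __init__(self, num_items, dim=64, short_len=5):
        super().__init__()
        self.short_len = short_len
        self.emb = nn.Embedding(num_items, dim)
        # One (attention, gated-attention) pair per sequence scale.
        self.long_plain = AttentionChannel(dim)
        self.long_gated = GatedAttentionChannel(dim)
        self.short_plain = AttentionChannel(dim)
        self.short_gated = GatedAttentionChannel(dim)
        self.fuse = nn.MultiheadAttention(dim, 2, batch_first=True)
        self.out = nn.Linear(dim, num_items)

    def forward(self, seq):                       # seq: (batch, seq_len) item ids
        x = self.emb(seq)
        x_short = x[:, -self.short_len:, :]       # recent suffix = short-term sequence
        feats = torch.cat(
            [self.long_plain(x), self.long_gated(x),
             self.short_plain(x_short), self.short_gated(x_short)], dim=1)
        fused, _ = self.fuse(feats, feats, feats)  # self-attention over channel features
        return self.out(fused.mean(dim=1))         # next-item scores


scores = FourChannelSketch(num_items=1000)(torch.randint(0, 1000, (4, 20)))
print(scores.shape)  # torch.Size([4, 1000])
```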