Abstract
Accurate traffic prediction plays a crucial role in ensuring traffic safety and minimizing property damage. Spatio-temporal graph neural networks (STGNNs) have gained significant attention from researchers aiming to capture the intricate time-varying relationships within traffic data. However, existing STGNNs commonly rely on Euclidean distance to assess the similarity between nodes, which may fail to reflect points of interest (POIs) or regional functions. Moreover, the traffic network is static from a macro perspective but undergoes dynamic changes at the micro level. Previous work that incorporates self-attention to capture time-varying features for constructing dynamic graphs suffers from the Softmax polarization effect: Softmax amplifies extreme value differences, overlooking weaker connections between nodes and failing to accurately represent their true relationships. To address these problems, we introduce the Multi-Scaled Spatio-Temporal Graph Neural Networks (MSSTGNN), which comprehensively capture traffic characteristics from multiscale viewpoints to construct multi-perspective graphs. We employ a trainable matrix to enhance the predefined adjacency matrices and to construct an optimal dynamic graph based on both trend and period, and we propose a graph aggregation technique to effectively merge the trend and periodic dynamic graphs. A temporal convolutional network (TCN) is developed to model nonstationary traffic data, and skip and residual connections are leveraged to increase model depth. A two-stage learning approach and a novel MSE loss function are designed to further enhance the model's performance. Experimental results demonstrate that MSSTGNN outperforms existing methods, achieving state-of-the-art performance across multiple real-world datasets.
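The Softmax polarization effect mentioned above can be illustrated with a minimal numeric sketch (not taken from the paper; the logit values are hypothetical): modest gaps in raw attention scores become extreme gaps in the normalized weights, so weaker node connections are driven toward zero and effectively vanish from the learned dynamic graph.

```python
import math

# Illustrative sketch of Softmax "polarization": small differences in
# attention logits turn into near-binary attention weights.
def softmax(scores):
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical attention logits between one node and three neighbours.
logits = [1.0, 2.0, 8.0]
weights = softmax(logits)

# The largest logit captures almost all of the weight; the other two
# neighbours receive near-zero attention despite being connected.
print(weights)
```

Under this behavior, a node pair whose logit is only a few units below the maximum is assigned an attention weight close to zero, which is the distortion MSSTGNN's graph construction is designed to avoid.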