Abstract
Improving the accuracy of long-term time series forecasting is important for many real-world applications. Transformer-based models have recently achieved significant gains in long-term forecasting, but they are memory-intensive and struggle to capture temporal patterns at multiple scales. To address this, we integrate time series decomposition into the Transformer framework, allowing the model to extract short- and long-term temporal patterns from the more predictable seasonal and trend components. In this paper, we propose a Transformer-based model named CLformer. Unlike previous methods, CLformer applies dilated convolutional networks to capture and refine multiple repeated temporal patterns before decomposition. To capture dependencies at multiple scales, we propose a local group autocorrelation (LGAC) mechanism, which computes autocorrelation within time series segments and thereby strengthens the model's ability to capture local temporal dynamics; stacking multiple LGAC layers lets the model capture multi-scale dependencies, further improving predictive performance. CLformer outperforms models based on global autocorrelation or self-attention in both efficiency and accuracy. Experimental results on six benchmark datasets show a relative improvement of 11.75% over state-of-the-art methods. Moreover, CLformer achieves a relative improvement of 18.89% on two datasets without apparent periodicity, demonstrating its effectiveness on time series that lack significant periodicity.
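The key idea behind LGAC is to compute autocorrelation per segment rather than over the whole series. As an illustration only, the following is a minimal NumPy sketch of that idea, assuming non-overlapping segments and FFT-based (circular) autocorrelation estimation; the function name, the `segment_len` parameter, and the normalization are hypothetical, and the paper's exact formulation (e.g., how correlations are turned into lag selection and value aggregation inside the attention layer) may differ.

```python
import numpy as np

def local_group_autocorrelation(x, segment_len):
    """Sketch: autocorrelation computed within local segments of a series.

    Hypothetical illustration of the LGAC idea, not the paper's exact method.
    """
    # Split the series into non-overlapping segments (local groups).
    n_seg = len(x) // segment_len
    segments = x[: n_seg * segment_len].reshape(n_seg, segment_len)
    # Remove each segment's mean so correlations reflect local dynamics.
    segments = segments - segments.mean(axis=1, keepdims=True)
    # Wiener-Khinchin: the inverse FFT of the power spectrum yields the
    # circular autocorrelation of each segment at every lag.
    spec = np.fft.rfft(segments, axis=1)
    acf = np.fft.irfft(spec * np.conj(spec), n=segment_len, axis=1)
    # Normalize so each segment's lag-0 correlation equals 1.
    return acf / (acf[:, :1] + 1e-12)

t = np.arange(96, dtype=float)
x = np.sin(2 * np.pi * t / 24)                    # toy series with period 24
acf = local_group_autocorrelation(x, segment_len=24)
print(acf.shape)                                  # (4, 24); lag-0 column is ~1
```

Because each segment only correlates with its own lags, the cost grows with the segment length rather than the full sequence length, which is consistent with the efficiency advantage the abstract claims over global autocorrelation and self-attention.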