Transformer-based methods have shown strong predictive performance in long-term time series forecasting. However, their attention mechanism destroys temporal dependence and has quadratic complexity, which makes the prediction process difficult to interpret and limits their application in tasks requiring interpretability. To address this issue, this paper proposes a highly interpretable long-term series forecasting model, TFformer. TFformer decomposes a time series into a low-frequency trend component and a high-frequency periodic component via frequency decomposition, and forecasts each separately. The periodic information in the high-frequency component is enhanced with a sequential frequency attention, after which the temporal patterns of the two components are obtained by feature extraction. Exploiting the periodicity of the series in the time domain, TFformer predicts future periodic patterns through periodic extension with a sequential periodic matching attention. Finally, the predicted future periodic pattern and the extracted trend pattern are reconstructed into the future series. Because TFformer retains temporal dependence using sequence-level attention, it provides an interpretable forecasting process with low time complexity. TFformer achieves significant prediction performance in both univariate and multivariate forecasting across six datasets, and detailed experimental results and analyses verify its effectiveness and generalization.
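The abstract does not give implementation details, but the decomposition step it describes can be illustrated with a minimal sketch. The snippet below splits a series into a low-frequency trend and a high-frequency periodic residual using an FFT low-pass filter; the function name `frequency_decompose` and the `cutoff` hyperparameter are assumptions for illustration only, not TFformer's actual method.

```python
import numpy as np

def frequency_decompose(x: np.ndarray, cutoff: int = 4):
    """Split a 1-D series into a low-frequency trend and a
    high-frequency periodic component via the real FFT.

    `cutoff` (the number of low-frequency bins kept for the trend)
    is a hypothetical hyperparameter; the abstract does not specify
    how TFformer chooses the frequency split.
    """
    spec = np.fft.rfft(x)

    # Keep only the lowest-frequency bins (including the DC term) for the trend.
    low = np.zeros_like(spec)
    low[:cutoff] = spec[:cutoff]
    trend = np.fft.irfft(low, n=len(x))

    # The residual high-frequency part carries the periodic structure.
    periodic = x - trend
    return trend, periodic

# Toy usage: a series with a slow drift, a fast oscillation, and noise.
t = np.arange(256)
series = 0.01 * t + np.sin(2 * np.pi * t / 24) + 0.1 * np.random.randn(256)
trend, periodic = frequency_decompose(series, cutoff=4)
```

In this sketch each component could then be forecast separately (the trend extrapolated, the periodic residual extended by its period) and the two predictions summed to reconstruct the future series, mirroring the pipeline the abstract outlines.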