The remarkable impact of transformers in artificial intelligence, exemplified by applications such as GPT-3 in language processing, has sparked interest in their potential for time series analysis. This study explores whether transformers, specifically the temporal fusion transformer (TFT), can outperform conventional methods in this domain. The research question is whether the TFT exhibits superior performance compared with a conventional recurrent neural network (RNN) method, the gated recurrent unit (GRU), and a traditional forecasting approach, the autoregressive integrated moving average (ARIMA) model, in time series analysis and temperature prediction. A comparative analysis is conducted among the three models: ARIMA, GRU, and TFT. The study uses time series data spanning from 1984 to the end of 2022. Model performance is evaluated with multiple metrics: mean absolute error (MAE), root mean square error (RMSE), mean absolute percentage error (MAPE), and the coefficient of determination (R²). The TFT model achieves the lowest MAE, indicating the most accurate predictions, and outperforms both the RNN and the traditional approach in temperature prediction tasks. Integrating the TFT model with the FAO Penman-Monteith method could improve irrigation scheduling through more accurate temperature predictions, potentially enhancing water-use efficiency and crop yields.
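For reference, the four evaluation metrics named above can be computed as in the minimal Python sketch below. This is an illustrative assumption, not code from the study; the function name and the placeholder temperature values are hypothetical.

    import numpy as np
    from sklearn.metrics import (
        mean_absolute_error,
        mean_squared_error,
        mean_absolute_percentage_error,
        r2_score,
    )

    def evaluate_forecast(y_true, y_pred):
        """Compute the metrics used in the comparison: MAE, RMSE, MAPE, and R²."""
        mae = mean_absolute_error(y_true, y_pred)
        rmse = np.sqrt(mean_squared_error(y_true, y_pred))       # root of the mean squared error
        mape = mean_absolute_percentage_error(y_true, y_pred)    # returned as a fraction; multiply by 100 for %
        r2 = r2_score(y_true, y_pred)
        return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "R2": r2}

    # Hypothetical example: observed vs. predicted temperatures (°C)
    y_true = np.array([21.3, 22.1, 19.8, 24.0])
    y_pred = np.array([21.0, 22.5, 20.1, 23.6])
    print(evaluate_forecast(y_true, y_pred))

The same function can be applied to the held-out predictions of each model (ARIMA, GRU, TFT) so that all three are scored on identical test data.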