Abstract
Researchers are focusing on improving time series forecasting methods to address real-world problems such as COVID-19. Current methods lose accuracy under unpredictable seasonality, and enhancing models to capture long-term dependencies is crucial for better forecasting accuracy. This paper presents a novel Transformer self-attention-based approach for infectious disease time series forecasting, specifically for COVID-19. The proposed method uses Ensemble Empirical Mode Decomposition (EEMD) and the Local Outlier Factor (LOF) for data pre-processing and outlier detection. A modified self-attention model based on the Transformer neural network is then introduced, applied to COVID-19 time series forecasting for the first time. The research specifically investigates encoder/decoder networks with an enhanced positional encoding approach, applying a novel time encoding technique to the input pattern to achieve more precise outputs. The parameters of the Transformer model are then tuned using the Arithmetic Optimization Algorithm (AOA) to further improve prediction accuracy. The model generates more accurate predictions over broader time intervals, achieving the lowest MAE of 371.92 and RMSE of 674.61, approximately 30% better predictive accuracy than other state-of-the-art methods. The proposed Transformer model demonstrates significant improvements in robustness and forecasting accuracy over standard approaches such as LSTM, RNN, Exponential Smoothing, AutoARIMA, and TBATS on the COVID-19 time series of India, the USA, and Brazil. Owing to its superior predictive accuracy, the suggested model is applicable to diverse time series forecasting domains such as stock market trends, sales, and industrial consumption forecasting.
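The sketch below illustrates, under stated assumptions, the pipeline outlined in the abstract: EEMD decomposition and LOF outlier screening of the raw series, followed by a Transformer encoder with a time-based positional encoding, and MAE/RMSE evaluation. It is not the authors' implementation; PyEMD, scikit-learn, and PyTorch are assumed tooling, the hyperparameters (window size, model width, number of LOF neighbours) are illustrative, and the AOA hyperparameter tuning step is omitted.

```python
# Minimal sketch of the described pipeline (assumptions noted inline).
import numpy as np
import torch
import torch.nn as nn
from PyEMD import EEMD                            # pip install EMD-signal
from sklearn.neighbors import LocalOutlierFactor

def preprocess(series: np.ndarray) -> np.ndarray:
    """Decompose the series with EEMD and mask LOF-flagged outliers."""
    imfs = EEMD().eemd(series)                    # intrinsic mode functions
    denoised = imfs[1:].sum(axis=0)               # drop highest-frequency IMF (assumption)
    labels = LocalOutlierFactor(n_neighbors=20).fit_predict(denoised.reshape(-1, 1))
    clean = denoised.copy()
    bad, good = np.where(labels == -1)[0], np.where(labels == 1)[0]
    clean[bad] = np.interp(bad, good, denoised[good])  # interpolate over outliers
    return clean

class TimeEncoding(nn.Module):
    """Sinusoidal encoding driven by a normalized calendar-time index (illustrative)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.freq = nn.Parameter(torch.randn(d_model // 2))
    def forward(self, t: torch.Tensor) -> torch.Tensor:   # t: (batch, seq, 1)
        angles = t * self.freq
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

class TransformerForecaster(nn.Module):
    """Encoder-only stand-in for the paper's encoder/decoder model."""
    def __init__(self, d_model: int = 64, nhead: int = 4, layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        self.time_enc = TimeEncoding(d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.head = nn.Linear(d_model, 1)
    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        h = self.embed(x) + self.time_enc(t)
        return self.head(self.encoder(h)[:, -1])          # one-step-ahead forecast

def mae_rmse(y_true: np.ndarray, y_pred: np.ndarray):
    err = y_true - y_pred
    return np.mean(np.abs(err)), np.sqrt(np.mean(err ** 2))
```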