Abstract

An improved transformer model based on a multi-head attention mechanism was constructed for long-term prediction of PM2.5 concentration. Monitoring data and meteorological data from 12 air quality monitoring stations in Beijing from March 2013 to December 2016 were collected and used to train the transformer model. The Pearson correlation coefficient was used to identify the key factors affecting PM2.5 concentration. A convolutional neural network (ResNet50) and a long short-term memory network (LSTM) were introduced for comparison, and the explained variance score (EVS), coefficient of determination (R2), mean square error (MSE), and mean absolute error (MAE) were selected to evaluate model performance. The Pearson analysis showed that PM10, SO2, NO2, CO, and atmospheric pressure (PRES) were highly correlated with PM2.5 concentration, and that dew point temperature (DEWP) was strongly correlated with it, consistent with the feature preferences selected automatically by the model. The MSE and R2 of the transformer model were 0.009 μg·m⁻³ and 0.925, respectively; compared with ResNet50 and LSTM, MSE decreased by 91.09% and 30.77%, and R2 increased by 38.05% and 4.65%, respectively. The transformer model captured both short-term pollution changes caused by sudden shifts in meteorological conditions and long-term trends with pronounced seasonal variation. Among the models compared, the transformer achieved the best fit, providing a novel method for long-term prediction of PM2.5 concentration. In addition, ablation experiments revealed that the increase in R2 of the transformer after screening the input data or manually setting preferences was relatively small, only 2.31% and 1.51%, respectively, indicating that the transformer model has strong anti-interference ability against data homologous with PM10.
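As a minimal sketch of the screening and evaluation steps described above, the Pearson correlation between PM2.5 and a candidate predictor, along with the EVS, R2, MSE, and MAE metrics, could be computed as follows. The toy arrays and the choice of PM10 as the predictor are illustrative assumptions, not data from the study.

```python
# Hedged sketch: Pearson-based feature screening and the four evaluation
# metrics (EVS, R2, MSE, MAE). All values below are illustrative toy data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import (explained_variance_score, r2_score,
                             mean_squared_error, mean_absolute_error)

# Toy hourly records: observed PM2.5 and one candidate predictor (PM10)
pm25 = np.array([35.0, 60.0, 80.0, 55.0, 40.0, 90.0])
pm10 = np.array([50.0, 85.0, 110.0, 70.0, 55.0, 120.0])

# |r| near 1 indicates a strong linear correlation with PM2.5
r, p_value = pearsonr(pm10, pm25)
print(f"Pearson r(PM10, PM2.5) = {r:.3f}")

# Model evaluation: compare (hypothetical) predictions against observations
y_true = pm25
y_pred = np.array([38.0, 58.0, 78.0, 57.0, 42.0, 88.0])
print("EVS:", explained_variance_score(y_true, y_pred))
print("R2 :", r2_score(y_true, y_pred))
print("MSE:", mean_squared_error(y_true, y_pred))
print("MAE:", mean_absolute_error(y_true, y_pred))
```

In practice the same screening loop would run over every candidate variable (SO2, NO2, CO, PRES, DEWP, etc.), retaining those whose correlation with PM2.5 exceeds a chosen threshold.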
