Groundwater, an essential water resource in Taiwan, is closely linked to land subsidence in the Zhuoshui River basin caused by excessive exploitation. There is therefore a critical need for robust monitoring and predictive tools to support effective water resource management. Despite the success of Transformer neural networks in natural language processing, their potential in environmental research remains underexplored. This study examines the feasibility of employing a Transformer-based deep neural network to simultaneously predict groundwater levels at eight strategically positioned monitoring stations in the distal fan of the Zhuoshui River basin in central Taiwan. Using a 20-year dataset sampled at 10-day intervals, we compare the Transformer model against benchmarks established separately with a Convolutional Neural Network (CNN), a Long Short-Term Memory (LSTM) neural network, and a Feedforward Neural Network (FFNN). Results reveal the Transformer's superior performance in predicting groundwater levels 10 and 20 days in advance. This performance is attributed to the Transformer's self-attention mechanism, which captures relationships among factors such as rainfall and river flow, improving accuracy in predicting both trends and peak values. The study underscores the Transformer model's efficacy in groundwater level prediction and its significance for sustainable water resource management amid climate change. The findings open new avenues for environmental research and water resource management, not only in Taiwan but also globally.
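To make the described setup concrete, the sketch below shows one way an encoder-only Transformer could map a window of past 10-day observations (groundwater levels, rainfall, river flow) to next-step levels at eight stations. The paper's actual architecture, input variables, and hyperparameters are not given in the abstract, so every dimension, layer choice, and variable count here is an illustrative assumption rather than the authors' implementation.

```python
# Minimal sketch only: all dimensions, layer counts, and the set of input
# variables are assumptions; the paper's exact model is not specified here.
import torch
import torch.nn as nn

class GroundwaterTransformer(nn.Module):
    """Encoder-only Transformer mapping a window of past 10-day observations
    to groundwater levels at 8 monitoring stations one step (10 days) ahead."""
    def __init__(self, n_features=10, n_stations=8, d_model=64,
                 n_heads=4, n_layers=2, seq_len=36):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)                  # embed each time step
        self.pos_embed = nn.Parameter(torch.zeros(1, seq_len, d_model))   # learned positional encoding
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_stations)                        # joint prediction for all 8 stations

    def forward(self, x):                    # x: (batch, seq_len, n_features)
        h = self.input_proj(x) + self.pos_embed[:, : x.size(1)]
        h = self.encoder(h)                  # self-attention over the input window
        return self.head(h[:, -1])           # (batch, n_stations): levels at the next 10-day step

# Usage: a batch of 4 sequences, each covering 36 ten-day steps of 10 input variables
model = GroundwaterTransformer()
y_hat = model(torch.randn(4, 36, 10))        # -> shape (4, 8)
```

Predicting all eight stations from a shared encoder lets the self-attention layers weigh hydrological drivers such as rainfall and river flow jointly across the basin, which is the mechanism the abstract credits for the improved trend and peak predictions; a 20-day-ahead forecast would follow the same pattern with a second output step or a shifted target.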