Accurate forecasting of fluctuations in the groundwater table is crucial for the effective management of regional water resources. This study explores the potential of using remotely sensed satellite data to predict and forecast water table variations. Specifically, two Artificial Neural Network (ANN) models were developed to simulate water table fluctuations at two distinct well sites, BA Ea 18 and FR Df 35, in Maryland. One model leveraged the relationship between variations in brightness temperature and water table depth, while the other was founded on the association between changes in soil moisture and water table depth. The models were trained and validated using recorded water table depths from the two wells, brightness temperature data acquired from the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E), and soil moisture estimates generated using the Land Data Assimilation System (LDAS). Both models exhibited strong performance in predicting and forecasting water table fluctuations, with root mean square errors ranging from 0.043 m to 0.047 m for a 12-month forecasting horizon. Sensitivity tests revealed that the models were more sensitive to uncertainties in water table depth than to uncertainties in either brightness temperature or soil moisture content. This underscores the feasibility of constructing an ANN-based water table prediction model even where high-resolution remotely sensed data are unavailable. In such situations, the model's efficacy is contingent on the compatibility of the time series trends in the input data, such as brightness temperature or soil moisture, with those observed at the study site.
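The abstract's second model maps recent soil moisture to water table depth with a feedforward ANN and evaluates it by root mean square error. The sketch below illustrates that setup in miniature; it is not the authors' implementation. The network architecture (one hidden layer, three-month lag window), the training scheme (batch gradient descent), and the data (synthetic monthly series standing in for LDAS soil moisture and observed well records) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly series (illustrative only): water table depth loosely
# follows a smoothed, lagged transform of soil moisture, plus noise.
n = 240
soil_moisture = (0.25 + 0.05 * np.sin(2 * np.pi * np.arange(n) / 12)
                 + 0.01 * rng.standard_normal(n))
depth = (2.0 - 3.0 * np.convolve(soil_moisture, np.ones(3) / 3, mode="same")
         + 0.02 * rng.standard_normal(n))

# Inputs: the previous 3 months of soil moisture; target: current depth.
lag = 3
X = np.stack([soil_moisture[i - lag:i] for i in range(lag, n)])
y = depth[lag:]

# Standardize inputs and target for stable training.
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()

# One hidden layer with tanh activation, trained by batch gradient descent.
h = 8
W1 = 0.5 * rng.standard_normal((lag, h))
b1 = np.zeros(h)
W2 = 0.5 * rng.standard_normal(h)
b2 = 0.0
lr = 0.05

for _ in range(2000):
    z = np.tanh(Xs @ W1 + b1)      # hidden-layer activations
    pred = z @ W2 + b2             # network output (standardized depth)
    err = pred - ys
    # Backpropagate mean-squared-error gradients through both layers.
    gW2 = z.T @ err / len(ys)
    gb2 = err.mean()
    dz = np.outer(err, W2) * (1 - z ** 2)
    gW1 = Xs.T @ dz / len(ys)
    gb1 = dz.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# RMSE in original units (metres), the abstract's evaluation metric.
pred_m = (np.tanh(Xs @ W1 + b1) @ W2 + b2) * y.std() + y.mean()
rmse = float(np.sqrt(np.mean((pred_m - y) ** 2)))
print(f"training RMSE: {rmse:.3f} m")
```

The brightness-temperature model in the abstract would differ only in its input series; the same lagged-window framing applies.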