Abstract

A new methodology is proposed for improving the accuracy of groundwater-level estimation and increasing the efficiency of groundwater-level monitoring networks. Three spatio-temporal (S-T) simulation models (numerical groundwater flow, artificial neural network, and S-T kriging) are implemented to simulate water-table level variations. The individual models are combined using model fusion techniques, and the most accurate of the individual and combined simulation models is selected for estimation. Leave-one-out cross-validation shows that the estimation error of the best fusion model is significantly lower than that of the three individual models. The selected fusion model is then used for optimal S-T redesign of the groundwater monitoring network of the Dehgolan Plain (Iran). Using a Bayesian maximum entropy interpolation technique, soft data are included in the geostatistical analyses. Different scenarios are defined to incorporate economic considerations and different levels of precision in selecting the best monitoring network; a network of 37 wells is proposed as the best configuration. The mean variance estimation errors of all scenarios decrease significantly compared with that of the existing monitoring network, and a reduction in the equivalent uniform annual costs of the different scenarios is also achieved.
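The abstract does not specify how the individual simulators are fused or how the cross-validation is carried out, so the following is only a minimal sketch of one plausible reading: least-squares weighting of the three models' predictions, evaluated by leaving out one well at a time. All data are synthetic stand-ins; the well counts, noise levels, and the linear fusion rule are assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: observed water-table levels at n_wells over n_months,
# plus predictions from three stand-in simulators representing (loosely)
# the numerical flow model, the ANN, and S-T kriging.
n_wells, n_months = 20, 24
observed = 1300 + np.cumsum(rng.normal(0, 0.3, (n_wells, n_months)), axis=1)

preds = {
    "flow_model": observed + rng.normal(0, 1.5, observed.shape),
    "ann":        observed + rng.normal(0, 1.0, observed.shape),
    "st_kriging": observed + rng.normal(0, 0.8, observed.shape),
}
stack = np.stack(list(preds.values()), axis=-1)      # (wells, months, models)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Leave-one-well-out cross-validation: fit fusion weights on all other
# wells, then score the held-out well for both the individual models
# and the fused (weighted-combination) model.
indiv_err = {name: [] for name in preds}
fused_err = []
for k in range(n_wells):
    train = np.delete(np.arange(n_wells), k)
    X = stack[train].reshape(-1, stack.shape[-1])    # stacked model predictions
    y = observed[train].reshape(-1)                  # matching observations
    w, *_ = np.linalg.lstsq(X, y, rcond=None)        # least-squares fusion weights
    fused_err.append(rmse(stack[k] @ w, observed[k]))
    for name in preds:
        indiv_err[name].append(rmse(preds[name][k], observed[k]))

for name, errs in indiv_err.items():
    print(f"{name:12s} LOOCV RMSE: {np.mean(errs):.3f} m")
print(f"{'fusion':12s} LOOCV RMSE: {np.mean(fused_err):.3f} m")
```

Under these assumptions the fused predictor's cross-validated error is no worse than the best individual model's, which mirrors the comparison the abstract reports; the paper's actual fusion technique may differ.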
