Abstract
Groundwater level is a key variable in determining the accuracy of groundwater models. Simple tools for predicting future groundwater levels and filling gaps in data sets are therefore important in groundwater hydrology. Prediction and simulation are two approaches that use previous and previous-plus-current data sets, respectively, to complete a time series. Artificial intelligence methods are capable of predicting and simulating different system states without relying on complex relations. This paper investigates the capability of an adaptive neuro-fuzzy inference system (ANFIS) and genetic programming (GP), two artificial intelligence tools, to predict and simulate groundwater levels in three observation wells in the Karaj plain of Iran. Precipitation, evaporation from a surface water body, and water levels in observation wells penetrating the aquifer system are used to fill gaps in the data sets and to estimate monthly groundwater level series. Results show that, in prediction, GP reduces the average root mean squared error (RMSE), used as the error criterion for the observation wells, by 8.35 and 11.33 percent in the training and testing data sets, respectively, compared with ANFIS. Similarly, in simulation, GP improves the average RMSE for the observation wells by 9.89 and 8.40 percent in the training and testing data sets, respectively. These results indicate that the proposed GP-based prediction and simulation approach is an effective tool for determining groundwater levels.
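The abstract compares models by their RMSE on training and testing sets and reports percent reductions relative to ANFIS. As a minimal sketch of that evaluation (the data values, array names, and the percent-reduction definition below are illustrative assumptions, not taken from the paper), RMSE for a monthly groundwater level series and the relative improvement between two models can be computed as follows:

```python
import numpy as np

def rmse(observed, predicted):
    """Root mean squared error between observed and predicted level series."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.sqrt(np.mean((observed - predicted) ** 2))

# Hypothetical monthly groundwater levels (m) for one observation well;
# the actual Karaj plain records are not reproduced here.
observed_test   = [12.4, 12.1, 11.8, 11.6, 11.9, 12.3]
anfis_predicted = [12.6, 12.0, 11.6, 11.9, 12.1, 12.1]
gp_predicted    = [12.5, 12.0, 11.7, 11.7, 12.0, 12.2]

rmse_anfis = rmse(observed_test, anfis_predicted)
rmse_gp = rmse(observed_test, gp_predicted)

# Assumed definition of the percent reduction reported in the abstract.
reduction = (rmse_anfis - rmse_gp) / rmse_anfis * 100.0

print(f"ANFIS RMSE: {rmse_anfis:.3f} m, GP RMSE: {rmse_gp:.3f} m")
print(f"GP reduces RMSE by {reduction:.2f} percent")
```

In the paper, the same comparison is averaged over the three observation wells and reported separately for the prediction and simulation modes.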