Abstract

In the present study, a new hybrid artificial-intelligence model is presented to simulate monthly groundwater-level (GWL) time-series data collected from an observation well located in Kermanshah, Iran. To define the hybrid AI model, the Differential Evolution algorithm is applied to optimize an Extreme Learning Machine (ELM), producing the Self-Adaptive ELM (SAELM), and the wavelet transform is employed to decompose the model's input variables (WSAELM). The autocorrelation function is used to detect the effective lags of the time series. Of the observed data, 70% is used to train the AI techniques and the remaining 30% is applied in test mode. Using these influential lags, different input combinations are defined for the SAELM and WSAELM models, and the superior models for GWL simulation are selected by evaluating these combinations. For the superior WSAELM model, the Nash-Sutcliffe efficiency coefficient, variance accounted for (VAF) and correlation coefficient (R) are obtained as 0.973, 97.450 and 0.988, respectively. The prediction uncertainty of the developed SAELM model for GWL is quantified as ±0.081. Finally, a formula is developed for computing GWL. The results show that the applied hybrid machine-learning model can be used to estimate the groundwater level in the Karnachi well, and they suggest that the WSAELM technique may be applicable to other areas.
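The evaluation metrics named in the abstract (Nash-Sutcliffe efficiency, VAF and R) have standard definitions. The sketch below shows how they are commonly computed; the function names, the chronological 70/30 split helper and the data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def vaf(obs, sim):
    """Variance accounted for, in percent."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return (1.0 - np.var(obs - sim) / np.var(obs)) * 100.0

def r(obs, sim):
    """Pearson correlation coefficient between observed and simulated series."""
    return np.corrcoef(np.asarray(obs, float), np.asarray(sim, float))[0, 1]

def train_test_split_chrono(series, train_frac=0.7):
    """Chronological split (no shuffling), as is usual for time series."""
    n_train = int(len(series) * train_frac)
    return series[:n_train], series[n_train:]
```

A perfect simulation gives NSE = 1, VAF = 100 % and R = 1; values such as 0.973, 97.450 and 0.988 reported for the superior WSAELM model indicate a close fit on the test set.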
