Abstract

The lack of information available to manage groundwater for irrigation is one of the biggest concerns for farmers and stakeholders in agricultural areas of Mississippi. In this study, we present a novel implementation of a nonlinear autoregressive with exogenous inputs (NARX) network to simulate daily groundwater levels at a local scale in the Mississippi River Valley Alluvial (MRVA) aquifer, located in the southeastern United States. The NARX network was trained using the Levenberg-Marquardt (LM) and Bayesian Regularization (BR) algorithms, and the results were compared to identify an optimal architecture for forecasting daily groundwater levels. The training algorithms were implemented with different combinations of hidden nodes and time delays (5, 25, 50, 75, and 100) until the optimal network was found. Eight years of daily historical input time series, including precipitation and groundwater levels, were used to forecast groundwater levels up to three months ahead. The comparison between LM and BR showed that NARX-BR is superior for forecasting daily levels based on the mean squared error (MSE), coefficient of determination (R2), and Nash-Sutcliffe coefficient of efficiency. The results showed that BR with two hidden nodes and 100 time delays provided the most accurate prediction of groundwater levels, with an error of ±0.00119 m. This innovative study is the first of its kind and will provide significant contributions to the implementation of data-based models (DBMs) for the prediction and management of groundwater for agricultural use.
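To make the NARX formulation described above concrete, the sketch below shows a minimal open-loop (series-parallel) NARX setup in Python: past groundwater levels and past precipitation over a fixed delay window are used as inputs to a small neural network that predicts the next day's level. This is an illustrative assumption, not the authors' published code; the synthetic data, the scikit-learn MLPRegressor, and its L-BFGS solver stand in for the study's eight-year observed series and its LM/BR training, while the two hidden nodes and 100 time delays mirror the best architecture reported in the abstract.

```python
# Illustrative NARX sketch (assumption): open-loop training of a small MLP on
# lagged groundwater levels (autoregressive inputs) and precipitation (exogenous inputs).
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_narx_features(gw_level, precip, delay):
    """Build the lagged input matrix: the past `delay` groundwater levels and
    precipitation values predict the groundwater level at time t."""
    X, y = [], []
    for t in range(delay, len(gw_level)):
        X.append(np.concatenate([gw_level[t - delay:t], precip[t - delay:t]]))
        y.append(gw_level[t])
    return np.array(X), np.array(y)

# Synthetic stand-ins for the eight-year daily series used in the study.
rng = np.random.default_rng(0)
n_days = 8 * 365
precip = rng.gamma(shape=0.5, scale=4.0, size=n_days)                      # mm/day
gw = 30 + np.cumsum(0.01 * precip - 0.02
                    + 0.005 * rng.standard_normal(n_days))                 # m

delay = 100                                  # 100 time delays, as in the best model
X, y = make_narx_features(gw, precip, delay)

split = int(0.8 * len(X))                    # simple chronological train/test split
model = MLPRegressor(hidden_layer_sizes=(2,),  # two hidden nodes, as reported
                     solver="lbfgs", alpha=1e-3, max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])

# One-step-ahead (open-loop) evaluation; closed-loop multi-step forecasting would
# feed predictions back into the groundwater-level lags instead of observed values.
pred = model.predict(X[split:])
mse = np.mean((pred - y[split:]) ** 2)
print(f"open-loop test MSE: {mse:.5f} m^2")
```

For multi-month forecasts such as the three-month horizon in the study, the trained network would be run in closed loop, with each predicted level appended to the autoregressive input window for the next step.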
