Abstract

Calibration of a distributed hydrologic model (WetSpa) for modeling river flows is performed using automatic parameter optimization. The main purpose of this research is to provide more confidence in the uncertainty analysis of the model parameters and predictions. A Box-Cox transformation and an autoregressive integrated moving average (ARIMA) time series model are used to transform the correlated and nonstationary model residuals to white noise disturbances, which can be minimized by ordinary least squares optimization. The WetSpa model is applied to the Illinois River basin, with a spatial resolution of 30 m and a 1 h time step for a 10-year simulation period (1996–2006). The model is calibrated using river flow records (1996–2002) and validated using the remaining flow data (2002–2006). The results show that simple calibration of the model is inaccurate, as the residuals exhibit heteroscedasticity, which results in inaccurate estimates of the model parameters and large prediction uncertainty. Th...
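The residual-treatment idea summarized above (stabilize variance with a Box-Cox transformation, remove nonstationarity by differencing, then fit an autoregressive term so the remaining disturbances are approximately white and suitable for ordinary least squares) can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's actual WetSpa residuals; the AR(1) fit stands in for the full ARIMA model, and all variable names are hypothetical.

```python
import numpy as np
from scipy import stats

# Synthetic positive, autocorrelated, heteroscedastic series standing in
# for raw model residuals (illustrative only; not the WetSpa data).
rng = np.random.default_rng(0)
n = 500
noise = np.zeros(n)
for t in range(1, n):
    noise[t] = 0.8 * noise[t - 1] + rng.normal(scale=0.5)
series = np.exp(0.002 * np.arange(n) + noise)  # strictly positive

# Step 1: Box-Cox transform stabilizes the variance
# (lambda chosen by maximum likelihood inside scipy.stats.boxcox).
transformed, lam = stats.boxcox(series)

# Step 2: difference once to remove nonstationarity (the "I" in ARIMA).
diffed = np.diff(transformed)

# Step 3: fit an AR(1) term by ordinary least squares and "whiten"
# the series by subtracting the predicted autoregressive component.
y, x = diffed[1:], diffed[:-1]
phi = float(np.dot(x, y) / np.dot(x, x))  # OLS slope through the origin
white = y - phi * x                       # approximately uncorrelated

# Lag-1 autocorrelation as a simple whiteness diagnostic.
def lag1_acf(z):
    z = z - z.mean()
    return float(np.dot(z[1:], z[:-1]) / np.dot(z, z))

print("lambda:", lam)
print("lag-1 ACF before/after whitening:", lag1_acf(diffed), lag1_acf(white))
```

The whitened disturbances are what an ordinary-least-squares objective can then legitimately minimize, since its assumptions (uncorrelated, homoscedastic errors) are closer to being met.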

