Model calibration is an essential step in land surface model (LSM) simulation for achieving the performance and predictability required in integrated water resources and disaster risk management under climate change. Because the performance criteria commonly used to evaluate or calibrate hydrological models each have their own advantages and limitations, it is necessary to understand the nature of each criterion when assessing model performance or identifying a model for a particular application. For design flow estimation from hydrological modeling data, the Liu Mean Efficiency (LME) was proposed by reformulating the three metric components (correlation, variability, and bias measures) of the Nash-Sutcliffe efficiency (NSE) and the Kling-Gupta efficiency (KGE) to improve the flow variability of runoff simulations optimized with the NSE or KGE. However, the LME criterion can pose serious challenges for comparative performance evaluation and reliable design flow estimation because its solutions are underdetermined and tend toward excessive flow variability. To complement the limitations of the NSE, KGE, and LME, this study therefore proposes a new rebalanced performance criterion based on least-squares regression components combined from both-way regression analysis between simulations and observations. Through a theoretical comparative analysis and a case study of long-term weekly streamflow time series from LSM simulations in four study watersheds with natural, unregulated flow observations, it is illustrated that the proposed criterion provides a rebalanced trade-off among the constituent metric components, improving flow variability while retaining the advantages of the KGE.
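For reference, the sketch below illustrates the standard NSE and KGE formulations and the three KGE components (correlation r, variability ratio alpha, and bias ratio beta) mentioned above; it is a minimal, assumption-laden example, not the paper's LME or proposed rebalanced criterion, whose exact formulations appear only in the full text. The function names and the synthetic weekly series are illustrative.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge_components(sim, obs):
    """Kling-Gupta efficiency and its three metric components:
    r     -- Pearson correlation between simulations and observations
    alpha -- variability ratio, std(sim) / std(obs)
    beta  -- bias ratio, mean(sim) / mean(obs)
    """
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    kge = 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)
    return kge, r, alpha, beta

# Illustrative synthetic weekly streamflow series (not data from the study)
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, size=520)                 # ~10 years of weekly flows
sim = 0.9 * obs + rng.normal(0.0, 2.0, size=520)    # a hypothetical simulation
print("NSE:", round(nse(sim, obs), 3))
print("KGE, r, alpha, beta:", [round(v, 3) for v in kge_components(sim, obs)])
```

A criterion optimized on KGE penalizes departures of r, alpha, and beta from 1 jointly; the trade-off among these components is what the proposed rebalanced criterion is designed to adjust.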