Abstract

[1] Modeling streamflow hydrographs can be a highly complex problem, particularly because of multiple dominant streamflow states, temporal switching of the dominant streamflow generation mechanisms, and catchment responses to precipitation inputs that vary dynamically with antecedent conditions. Because of these complexities and the extreme heterogeneity that can exist within a single catchment, model calibration techniques are generally required to obtain reasonable estimates of the model parameters. Models are typically calibrated such that a best fit is determined over the entire period of simulation; in this way, each time step explicitly carries equal weight during the calibration process. Data transformations (e.g., logarithmic or square root) are a common way of modifying the calibration process by scaling the magnitude of the observations. Here we consider a data transformation that is focused on the time domain rather than the data domain. This approach, previously employed in the transit time modeling literature, conceptually stretches time during high streamflows and compresses it during low streamflow periods, dynamically weighting streamflows in the time domain. The transformation, known as flow-corrected time, is designed to give greater weight to time periods with larger hydrologic flux. Here the flow-corrected time transformation is compared to a baseline untransformed case and to the commonly employed logarithmic transformation. Considering both visual and numerical (Nash-Sutcliffe efficiency) assessments, we demonstrate that the flow-corrected time transformation results in improved fits to the observed hydrograph over the time periods that dominate hydrologic flux.
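The abstract does not state the transformation's formula, so the sketch below is only a hedged illustration. It assumes the formulation commonly used in the transit time literature, in which flow-corrected time accumulates in proportion to discharge, dτ = (Q/Q̄) dt, so that equal increments of τ carry equal volumes of streamflow. The function names (`flow_corrected_time`, `nse`) and the synthetic discharge series are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the authors' code): flow-corrected time (FCT) and a
# Nash-Sutcliffe efficiency evaluated after resampling to the FCT domain.
# Assumption: FCT accumulates in proportion to discharge, d(tau) = (Q / Qbar) dt,
# so equal increments of tau correspond to equal volumes of streamflow.
import numpy as np


def flow_corrected_time(q, dt=1.0):
    """Map clock time to flow-corrected time for a discharge series q."""
    q = np.asarray(q, dtype=float)
    # Cumulative discharge normalized by the mean flow; time units are preserved.
    return np.cumsum(q) * dt / q.mean()


def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)


# --- Illustrative usage with synthetic data (hypothetical values) ------------
t = np.arange(0, 365, 1.0)                               # daily time steps [d]
q_obs = 1.0 + 5.0 * np.exp(-((t - 120) / 15.0) ** 2)     # baseflow + one event [m^3/s]
q_sim = 1.0 + 4.2 * np.exp(-((t - 123) / 18.0) ** 2)     # a simulated hydrograph

tau = flow_corrected_time(q_obs)                         # FCT axis from observations

# Resample both series onto an evenly spaced FCT grid: high-flow periods are
# stretched (more points) and low-flow periods compressed (fewer points), so a
# goodness-of-fit measure computed in this domain implicitly weights the
# periods of large hydrologic flux.
tau_grid = np.linspace(tau[0], tau[-1], t.size)
q_obs_fct = np.interp(tau_grid, tau, q_obs)
q_sim_fct = np.interp(tau_grid, tau, q_sim)

print(f"NSE (clock time):          {nse(q_obs, q_sim):.3f}")
print(f"NSE (flow-corrected time): {nse(q_obs_fct, q_sim_fct):.3f}")
```

In this sketch the same Nash-Sutcliffe formula is applied in both domains; the only change is the time axis on which the series are compared, which is the sense in which the transformation reweights the calibration in time rather than rescaling the data values.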
