Abstract

Nine distributed hydrological models, forced with common meteorological inputs, simulated naturalized daily discharge from the Thames basin for 1963–2001. While model-dependent evaporative losses are critical for modeling mean discharge, multiple physical processes at many time scales influence the variability and timing of discharge. Here the use of cross-spectral analysis is advocated to measure how the average amplitude, and independently the average phase, of modeled discharge differ from observed discharge at daily to decadal time scales. Simulating the spectral properties of the model discharge via numerical manipulation of precipitation confirms that the modeled transformation of precipitation into discharge involves runoff generation and routing processes that amplify the annual cycle, while subsurface storage and routing of runoff between grid boxes introduce most of the autocorrelation and delays. Too much or too little modeled evaporation affects discharge variability, as do the capacity and time constants of modeled stores. Additionally, the performance of specific models would improve if four issues were tackled: 1) nonsinusoidal annual variations in model discharge (prolonged low base flow and shortened high base flow; three models), 2) excessive attenuation of high-frequency variability (three models), 3) excessive short-term variability in winter half years but too little variability in summer half years (two models), and 4) introduction of phase delays at the annual scale only during runoff generation (three models) or only during routing (one model). Cross-spectral analysis reveals how reruns of one model using alternative methods of runoff generation, designed to improve performance at the weekly to monthly time scales, degraded performance at the annual scale. The cross-spectral approach facilitates hydrological model diagnoses and development.
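The core diagnostic described above, comparing the amplitude (gain) and phase of modeled discharge against observations frequency by frequency, can be illustrated with a minimal FFT-based sketch. This is an assumed, simplified estimator (no spectral smoothing or windowing, unlike a production cross-spectral analysis); the function name and the synthetic series are hypothetical, not taken from the paper:

```python
import numpy as np

def cross_spectrum(obs, mod, dt=1.0):
    """Cross-spectral gain and phase of a modeled series relative to observations.

    gain > 1 at a frequency means the model amplifies variability at that
    time scale; phase > 0 (radians) means the model lags the observations.
    Illustrative raw-periodogram sketch only; a real analysis would smooth
    the spectra to stabilize the estimates.
    """
    n = len(obs)
    freqs = np.fft.rfftfreq(n, d=dt)                 # cycles per time unit
    o = np.fft.rfft(obs - np.mean(obs))
    m = np.fft.rfft(mod - np.mean(mod))
    s_om = o * np.conj(m)                            # cross-spectrum
    s_oo = np.abs(o) ** 2                            # observed power spectrum
    gain = np.abs(s_om) / np.where(s_oo > 0, s_oo, np.nan)
    phase = np.angle(s_om)                           # >0 => model delayed
    return freqs, gain, phase

# Synthetic example: a "model" that amplifies the annual cycle by 1.5x
# and delays it by 0.3 rad (~17 days), mimicking the kinds of amplitude
# and phase errors the abstract diagnoses.
t = np.arange(4 * 365)                               # four years, daily
obs = np.sin(2 * np.pi * t / 365.0)
mod = 1.5 * np.sin(2 * np.pi * t / 365.0 - 0.3)
f, g, p = cross_spectrum(obs, mod)
k = np.argmin(np.abs(f - 1.0 / 365.0))               # bin nearest annual scale
```

At the annual frequency bin, `g[k]` recovers the 1.5 amplification and `p[k]` the 0.3 rad delay, showing how the two errors are measured independently.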
