Abstract

Flood warnings provide information about the timing and magnitude of impending floods, which can help mitigate the adverse impacts of flooding. Flood forecasts are strongly influenced by uncertainty in rainfall forecasts as well as in initial catchment wetness. Event-based models are simple and parsimonious and are widely favored by practitioners for flood estimation. However, these models require loss parameters to be specified manually for each simulated event, and this represents an additional source of uncertainty that must be considered along with errors in observations and rainfall forecasts. Little attention has been given to coupling updating techniques with event-based models to reduce the uncertainty associated with catchment wetness. To this end, we devised a sequential recalibration scheme to characterize uncertainty in ensemble forecasts derived using an event-based flood model. The scheme uses information on both observation and model errors to filter and update catchment loss estimates and thereby improve forecast accuracy. Analysis of flood forecasts for 22 events showed that although forecasts derived solely from external estimates of catchment wetness initially had low skill, their reliability and accuracy improved rapidly once the flood event commenced and the flood model was coupled with the updating scheme. Compared with forecasts made without any updating, the conditioning and recalibration steps progressively improved forecast accuracy: the Nash-Sutcliffe efficiency increased from −0.14 to 0.88, bias was reduced by 78%, and the root-mean-square error was reduced by 67%. The use of such schemes thus reinforces the advantages of the parsimonious models that practitioners have long favored for design and other purposes.
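
For context, the Nash-Sutcliffe efficiency (NSE) quoted above is the standard skill measure computed against observed flows; the symbols below are generic, not necessarily those used in the paper:

\[
\mathrm{NSE} = 1 - \frac{\sum_{t=1}^{T}\left(Q_o^t - Q_f^t\right)^2}{\sum_{t=1}^{T}\left(Q_o^t - \overline{Q_o}\right)^2}
\]

where \(Q_o^t\) and \(Q_f^t\) are the observed and forecast flows at time step \(t\) and \(\overline{Q_o}\) is the mean observed flow over the forecast period. An NSE of 1 indicates a perfect forecast, while values at or below zero (such as the initial −0.14) indicate forecasts no more skilful than simply using the mean of the observed flows, so the improvement to 0.88 represents a substantial gain in skill.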
