Abstract

As hydrologists, we pride ourselves on being able to identify deficiencies of a hydrologic model by looking at its runoff simulations. Generally, one of the first questions a practicing hydrologist asks when presented with a new model is: "show me some hydrographs!" Everyone has an intuition about how a "real" (i.e., observed) hydrograph should behave [1, 2]. Although there exists a large suite of summary metrics that measure differences between simulated and observed hydrographs, these metrics do not always fully capture our professional intuition about what constitutes an adequate hydrological prediction (perhaps because metrics typically aggregate over many aspects of model performance). To us, this suggests that either (a) there is potential to improve existing metrics so that they conform better with expert intuition, (b) our expert intuition is overvalued and we should focus more on metrics, or (c) a bit of both.

In the social study proposed here, we aim to address this issue in a data-driven fashion: we will ask experts to visit a website where they are tasked to compare two unlabeled hydrographs side by side against an observed hydrograph, and to decide which of the unlabeled ones matches the observations better. Together with information about the experts' background expertise, the collected responses should help paint a more nuanced picture of the aspects of hydrograph behavior that different members of the community consider important. This should provide valuable information that may enable us to derive new (and hopefully better) model performance metrics in a data-driven fashion directly from human ratings.

[1] Crochemore, Louise, et al. "Comparing expert judgement and numerical criteria for hydrograph evaluation." Hydrological Sciences Journal 60.3 (2015): 402-423.

[2] Wesemann, Johannes, et al. "Man vs. Machine: An interactive poll to evaluate hydrological model performance of a manual and an automatic calibration." EGU General Assembly Conference Abstracts. 2017.
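The abstract does not name specific summary metrics. As an illustration of the kind of aggregate score it refers to, the following minimal Python sketch computes two widely used hydrograph metrics, the Nash-Sutcliffe efficiency (NSE) and the Kling-Gupta efficiency (KGE), for paired observed and simulated discharge series. The function names and example data are ours, not part of the proposed study.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the
    simulation is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency: combines correlation, variability ratio,
    and bias ratio into a single score; 1 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]      # linear correlation
    alpha = sim.std() / obs.std()        # variability ratio
    beta = sim.mean() / obs.mean()       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

# Hypothetical example: two candidate simulations of the same event.
obs   = np.array([1.0, 2.0, 6.0, 9.0, 5.0, 3.0, 2.0, 1.5])
sim_a = np.array([1.1, 2.2, 5.5, 8.0, 5.2, 3.1, 2.1, 1.6])  # good timing, slightly low peak
sim_b = np.array([1.0, 1.5, 3.0, 6.0, 9.0, 5.0, 3.0, 2.0])  # correct peak height, but delayed

for name, sim in [("A", sim_a), ("B", sim_b)]:
    print(f"Simulation {name}: NSE = {nse(obs, sim):.2f}, KGE = {kge(obs, sim):.2f}")
```

Scores like these compress many aspects of the fit (timing, peak magnitude, bias, variability) into a single number, which is exactly the kind of aggregation the abstract argues may not match expert visual judgement of a hydrograph.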
