Abstract

Water resources numerical models depend upon various hydrologic field data inputs. As models become increasingly complex and simulation times grow, it is critical to understand the inherent value of the different input datasets available. One important category of model input is precipitation data; for hydrologic models, precipitation inputs are perhaps the most critical. Common precipitation inputs include either rain gauge records or remotely sensed data such as Next-Generation Radar (NEXRAD) data. NEXRAD data provide a higher level of spatial resolution than point rain gauge coverage, but require more extensive pre- and post-processing along with additional computational resources. This study first documents the development and initial calibration of a HEC-HMS model of a subtropical watershed in the Upper St. Johns River Basin in Florida, USA. It then compares the calibration performance of the same HEC-HMS model using either rain gauge or NEXRAD precipitation inputs. The results are further broken down by comparing key calibration statistics, such as the Nash–Sutcliffe Efficiency, across spatial scales and rainfall return frequencies. The study revealed that at larger spatial scales, calibration performance was about the same for the two precipitation datasets, while NEXRAD showed some benefit for smaller watersheds. Similarly, for smaller return frequency precipitation events, NEXRAD data were superior.
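The abstract compares calibration runs using the Nash–Sutcliffe Efficiency (NSE). For reference, a minimal sketch of the statistic is shown below; the function and variable names are illustrative, not taken from the paper, and the inputs are assumed to be paired time series of equal length.

```python
def nash_sutcliffe(observed, simulated):
    """Nash–Sutcliffe Efficiency for paired flow series.

    NSE = 1 means a perfect fit; NSE = 0 means the model is no
    better a predictor than the mean of the observations; NSE < 0
    means it is worse than the observed mean.
    """
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

For example, a simulation identical to the observations scores 1.0, while a constant simulation equal to the observed mean scores exactly 0.0.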

Highlights

  • Computer models that simulate hydrologic runoff processes are essential tools for understanding and describing the overall hydrologic cycle

  • Precipitation measurements from rain gauges or meteorological stations have been used as the only reliable source of precipitation in watershed modeling [2]

  • The benefit of rain gauges is their ability to obtain a precise point value for precipitation, with minimal data processing needed for use in hydrologic model applications


Introduction

Computer models that simulate hydrologic runoff processes are essential tools for understanding and describing the overall hydrologic cycle. They are routinely used for important studies regarding water management, water quality issues, land use changes, flood inundation, and many other forecasting applications. Some researchers believe the spatial and temporal variability of precipitation data is the main source of input data uncertainty when rainfall–runoff models are applied [1]. Because precipitation is the primary forcing mechanism for hydrologic models, selecting the suitable meteorological input dataset for precipitation is perhaps the most critical step in model development. The benefit of rain gauges is their ability to obtain a precise point value for precipitation, with minimal data processing needed for use in hydrologic model applications. A commonly used method, the Thiessen Polygon method, calculates the weight of each rain gauge according to the gauge's location, creating a polygon network and applying the gauge rainfall quantity uniformly over each polygon's area.
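The Thiessen Polygon idea above can be sketched numerically: each point in the basin is assigned to its nearest gauge, so a gauge's weight is the fraction of basin area closest to it, and basin-average rainfall is the weighted sum of gauge totals. The sketch below approximates the polygon areas by counting grid cells; the gauge coordinates, grid resolution, and rainfall values are illustrative assumptions, not data from the study.

```python
import math

def thiessen_weights(gauges, grid_cells):
    """Approximate Thiessen polygon weights by nearest-gauge assignment.

    gauges     : list of (x, y) gauge coordinates (hypothetical units)
    grid_cells : list of (x, y) cell centers discretizing the basin
    Returns one area-fraction weight per gauge (weights sum to 1).
    """
    counts = [0] * len(gauges)
    for cell in grid_cells:
        # Assign the cell to the closest gauge (ties go to the first).
        nearest = min(range(len(gauges)), key=lambda i: math.dist(cell, gauges[i]))
        counts[nearest] += 1
    total = sum(counts)
    return [c / total for c in counts]

def areal_precip(weights, gauge_rainfall):
    """Basin-average rainfall as the weighted sum of gauge totals."""
    return sum(w * p for w, p in zip(weights, gauge_rainfall))
```

With two gauges placed symmetrically at opposite ends of a rectangular basin, each receives a weight of 0.5, and gauge totals of 20 and 30 mm give a basin average of 25 mm.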

