Abstract
Quantifying evaporative loss from reservoirs is critical to sound water-availability planning and reservoir management. Various methods are used to quantify reservoir evaporation; however, each method carries a degree of uncertainty that propagates into model predictions of available water within a reservoir or a reservoir network. Herein, we explore the impact of uncertainty in reservoir evaporation on model outputs of historical and future water availability across the five major reservoirs in the Savannah River Basin in South Carolina, USA, using four different evaporation methods. Variability in the total available water is evaluated using the United States Army Corps of Engineers (USACE) 2006 Drought Contingency Plan hydrologic model of the Savannah River Basin, which incorporates recent water-management plans and reservoir controls. Results indicate that, during droughts, reservoir evaporation plays a large role in water-availability predictions, and uncertainty in evaporative losses produces significant uncertainty in modeled water availability for extreme events. For example, the return period for an event in which the availability of water in Lake Hartwell was reduced to 50% of full pool capacity varied from 38.2 years to 53.4 years, a 40% variation, depending on the choice of evaporation parameterization.