Abstract

Record low snowpack conditions were observed at Snow Telemetry stations in the Cascades Mountains, USA during the winters of 2014 and 2015. We tested the hypothesis that these winters are analogs for the temperature sensitivity of Cascades snowpacks. In the Oregon Cascades, the 2014 and 2015 winter air temperature anomalies were approximately +2 °C and +4 °C above the climatological mean. We used a spatially distributed snowpack energy balance model to simulate the sensitivity of multiple snowpack metrics to a +2 °C and +4 °C warming and compared our modeled sensitivities to observed values during 2014 and 2015. We found that for each +1 °C warming, modeled basin-mean peak snow water equivalent (SWE) declined by 22%–30%, the date of peak SWE (DPS) advanced by 13 days, the duration of snow cover (DSC) shortened by 31–34 days, and the snow disappearance date (SDD) advanced by 22–25 days. Our hypothesis was not borne out by the observations except in the case of peak SWE; other snow metrics did not resemble predicted values based on modeled sensitivities and thus are not effective analogs of future temperature sensitivities. Rather than just temperature, it appears that the magnitude and phasing of winter precipitation events, such as large, late spring snowfall, controlled the DPS, SDD, and DSC.
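As a rough illustration of how the per-degree sensitivities quoted above could be applied to a warming scenario, the sketch below scales a hypothetical baseline peak SWE and shifts the date-based metrics by the reported midpoint values. The baseline SWE, the use of midpoint sensitivities, and the compounding of the percentage decline per degree are assumptions made for illustration only, not results from the study.

```python
# Minimal sketch (not from the paper): projecting snow metrics for the +2 degC and
# +4 degC scenarios using the per-degree sensitivities reported in the abstract.
# Baseline peak SWE is a placeholder value; whether the percent decline compounds
# per degree or applies linearly is not specified here, so compounding is assumed.

def project_snow_metrics(warming_degc,
                         baseline_peak_swe_mm=1000.0,      # hypothetical basin-mean peak SWE
                         swe_decline_per_degc=0.26,        # midpoint of the 22%-30% per degC range
                         dps_advance_days_per_degc=13.0,   # date of peak SWE advance per degC
                         dsc_shortening_days_per_degc=32.5,  # snow-cover duration, midpoint of 31-34 days
                         sdd_advance_days_per_degc=23.5):    # snow disappearance date, midpoint of 22-25 days
    """Apply linear per-degree shifts and a compounding SWE decline to a warming scenario."""
    projected_peak_swe = baseline_peak_swe_mm * (1.0 - swe_decline_per_degc) ** warming_degc
    return {
        "warming_degC": warming_degc,
        "projected_peak_swe_mm": round(projected_peak_swe, 1),
        "dps_advance_days": dps_advance_days_per_degc * warming_degc,
        "dsc_shortening_days": dsc_shortening_days_per_degc * warming_degc,
        "sdd_advance_days": sdd_advance_days_per_degc * warming_degc,
    }

if __name__ == "__main__":
    for dt in (2.0, 4.0):  # the two warming anomalies examined in the study
        print(project_snow_metrics(dt))
```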
