Abstract

As climate change unfolds, extreme weather events and natural hazards relevant to the insurance industry may change in frequency and intensity. To ensure risk-adequate pricing, risk modelers must ask whether a given ‘hazard’ parameter, derived and calibrated from past observations, is still valid or needs to be updated. Using heavy rainfall at durations between 3 h and 72 h as a proxy for flash flooding, we apply a unique high-resolution single-model initial-condition large ensemble and outline a methodology by which risk modelers can assess whether, where, and by how much the hazard component of their risk model needs updating. To this end, we compare two time periods: 1980–1999 (an example of a typical baseline period) and 2015–2034 (a 20-year period centered around today). We argue that assessing changes over the full ensemble space is vital for (i) identifying homogeneous regions where a certain signal emerges, and (ii) quantifying risk changes in the tail of the extreme value distribution that may still be hidden in the mean response to climate change. In the example case of 3-hourly 50-year rainfall return levels, we find a significant increase between 1980–1999 and 2015–2034 over 44 % of the European land area. We also identify specific risk regions, such as northwestern Spain, where changes in the very tail (100-year rainfall return level) emerge before changes in more common extremes (1-year rainfall return level). Failing to detect and consider these tail changes, or assuming equal changes for common and rare extremes, may therefore lead to an under- or overestimation of the true level of risk today.
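
To illustrate the kind of calculation the abstract refers to, the following is a minimal Python sketch, not the authors' method or data: the Gumbel-generated "annual maxima", the member counts, and all numbers are hypothetical stand-ins. It fits a generalized extreme value (GEV) distribution to annual rainfall maxima pooled across ensemble members, separately for each period, and compares the resulting 50-year return levels.

import numpy as np
from scipy.stats import genextreme

def return_level(annual_maxima: np.ndarray, return_period: float) -> float:
    """Estimate the T-year return level from a GEV fit to annual maxima."""
    shape, loc, scale = genextreme.fit(annual_maxima)
    # The T-year return level is the quantile exceeded with probability 1/T.
    return genextreme.ppf(1.0 - 1.0 / return_period, shape, loc=loc, scale=scale)

rng = np.random.default_rng(42)
# Hypothetical pooled annual maxima (mm per 3 h) for 50 ensemble members
# times 20 years per period; real values would come from the model ensemble.
baseline = rng.gumbel(loc=25.0, scale=8.0, size=50 * 20)  # 1980-1999
present = rng.gumbel(loc=27.0, scale=9.0, size=50 * 20)   # 2015-2034

rl50_base = return_level(baseline, 50)
rl50_now = return_level(present, 50)
print(f"50-year return level: {rl50_base:.1f} -> {rl50_now:.1f} mm "
      f"({100 * (rl50_now / rl50_base - 1):+.1f} %)")

Pooling block maxima across members reflects why a large ensemble matters here: it makes tail quantiles such as the 100-year level estimable from a 20-year window, whereas a single realization would leave such levels far outside the observed record.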
