Abstract
Rare weather and climate events, such as heat waves and floods, can carry enormous social costs. Because observational climate data are often limited in duration and spatial coverage, climate forecasting has frequently turned to simulations of climate models to improve predictions of rare weather events. However, the very long simulations of complex models required to obtain accurate probability estimates may be prohibitively slow. Developing probabilistic and dynamical techniques that estimate the probabilities of rare events accurately from limited data is therefore an important scientific problem. In this paper we compare four modern methods of estimating the probability of rare events: the generalized extreme value (GEV) method from classical extreme value theory; two importance-sampling techniques, genealogical particle analysis (GPA) and the Giardina-Kurchan-Lecomte-Tailleur (GKLT) algorithm; and brute-force Monte Carlo (MC). With these techniques we estimate the probabilities of rare events in three dynamical models: the Ornstein-Uhlenbeck process, the Lorenz '96 system, and PlaSim, a climate model. Holding the computational effort constant, we assess how well each technique's rare-event probability estimates compare with a gold standard obtained from a very long control run. Somewhat surprisingly, we find that classical extreme value theory methods outperform GPA, GKLT, and MC at estimating rare-event probabilities.
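To make the contrast between the GEV approach and brute-force Monte Carlo concrete, here is a minimal sketch (not the paper's implementation) of block-maxima GEV fitting with SciPy. The synthetic Gaussian series, block size, and threshold are arbitrary placeholders standing in for a model time series and a rare-event level:

```python
# Sketch: block-maxima GEV estimate vs brute-force Monte Carlo
# on a synthetic stationary series (stand-in for a climate model output).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
series = rng.standard_normal(100_000)  # placeholder "observable" time series

# Block maxima: partition the series into blocks and keep each block's maximum.
block_size = 100
maxima = series.reshape(-1, block_size).max(axis=1)

# Fit a GEV distribution to the block maxima (classical extreme value theory).
c, loc, scale = genextreme.fit(maxima)

threshold = 4.0  # placeholder rare-event level

# GEV estimate of P(block maximum > threshold): extrapolates into the tail.
p_gev = genextreme.sf(threshold, c, loc, scale)

# Brute-force Monte Carlo estimate from the same data: empirical frequency.
p_mc = np.mean(maxima > threshold)

print(f"GEV estimate: {p_gev:.3e}")
print(f"MC estimate:  {p_mc:.3e}")
```

For thresholds rarer than anything in the sample, the Monte Carlo frequency collapses to zero while the fitted GEV still yields a (model-based) probability, which is the trade-off the comparison in this paper quantifies.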