Research Article | February 18, 2014

Adaptive Smoothing of Seismicity in Time, Space, and Magnitude for Time‐Dependent Earthquake Forecasts for California

Agnès Helmstetter (ISTerre, Université de Grenoble 1, CNRS, BP 53, F‐38041 Grenoble, France; agnes.helmstetter@ujf-grenoble.fr)
Maximilian J. Werner (Department of Geosciences, Princeton University, Princeton, New Jersey 08544; mwerner@princeton.edu)

Bulletin of the Seismological Society of America (2014) 104 (2): 809–822. https://doi.org/10.1785/0120130105
First online: 14 Jul 2017

Abstract

We present new methods for short‐term earthquake forecasting that employ space, time, and magnitude kernels to smooth seismicity. These methods are purely statistical and rely on very few assumptions about seismicity. In particular, we do not use the Omori–Utsu law, and only one of our two new models assumes a Gutenberg–Richter law to model the magnitude distribution; the second model estimates the magnitude distribution nonparametrically with kernels. We employ adaptive kernels of variable bandwidths to estimate seismicity in space, time, and magnitude bins. To project rates over short time scales into the future, we simply assume persistence, that is, a constant rate over short time windows. The resulting forecasts from the two new kernel models are compared with those of the epidemic‐type aftershock sequence (ETAS) model generated by Werner et al. (2011). Although our new methods are simpler and require fewer parameters than ETAS, the obtained probability gains are surprisingly close. Nonetheless, ETAS performs significantly better in most comparisons, and the kernel model with a Gutenberg–Richter law attains larger gains than the kernel model that nonparametrically estimates the magnitude distribution. Finally, we show that combining ETAS and kernel model forecasts, by simply averaging the expected rate in each bin, can provide greater predictive skill than either ETAS or the kernel models can achieve individually.
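
To make the kernel idea concrete, the following is a minimal Python sketch (not the authors' code or data) of two points from the abstract: adaptive spatial smoothing in which each earthquake's bandwidth is set to the distance to its k-th nearest neighbor, and forecast combination by simply averaging the expected rate in each bin. The Gaussian kernel, the grid, the value of k, and all function names are illustrative assumptions, not taken from the paper.

    # Minimal sketch, assuming a Gaussian kernel and a k-nearest-neighbor
    # bandwidth rule; grid limits, k, and names are hypothetical.
    import numpy as np

    def adaptive_rate_map(x, y, grid_x, grid_y, k=3):
        """Smoothed rate (events per grid cell) from epicenter coordinates."""
        pts = np.column_stack([x, y])
        # Pairwise distances between events; bandwidth_i = distance to the
        # k-th nearest neighbor of event i (index 0 is the event itself).
        d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
        bandwidth = np.sort(d, axis=1)[:, k]
        gx, gy = np.meshgrid(grid_x, grid_y)
        rate = np.zeros_like(gx)
        for (xi, yi), h in zip(pts, bandwidth):
            r2 = (gx - xi) ** 2 + (gy - yi) ** 2
            # Isotropic 2D Gaussian kernel centered on the event.
            rate += np.exp(-r2 / (2 * h ** 2)) / (2 * np.pi * h ** 2)
        return rate / rate.sum() * len(x)   # renormalize to the event count

    rng = np.random.default_rng(0)
    x, y = rng.normal(0, 1, 200), rng.normal(0, 1, 200)   # synthetic epicenters
    grid = np.linspace(-3, 3, 50)
    kernel_forecast = adaptive_rate_map(x, y, grid, grid)

    # Combined forecast: per-bin average of two models' expected rates.
    # A uniform placeholder stands in for an ETAS forecast here.
    etas_like = np.full_like(kernel_forecast, kernel_forecast.mean())
    combined = 0.5 * (kernel_forecast + etas_like)

In the paper the bandwidths also adapt in time and magnitude and rates are projected forward by persistence; neither refinement is shown in this sketch.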
