Abstract

Over the past decades, much effort has been devoted to understanding and forecasting natural hazards. However, earthquake forecasting skill is still very limited and remains a great scientific challenge. This limited predictability stems partly from the erratic nature of earthquakes and partly from the lack of understanding of their underlying mechanisms. To improve our understanding and potential forecasting, we study here the spatial and temporal long-term memory of interevent times and distances of earthquakes above a given magnitude threshold, using lagged conditional probabilities. We find, in real data, that the lagged conditional probabilities show long-term memory for both interevent times and interevent distances, and that the memory functions obey scaling and decay slowly with time, until, at a characteristic time, the decay crosses over to a fast decay. We also show that the ETAS model, which is often used to forecast earthquake events, yields scaling functions of the temporal and spatial interevent intervals that are not consistent with those of real data.
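For concreteness, the following is a minimal sketch, under assumed parameter values, of how a purely temporal ETAS catalog can be generated with the standard branching construction: background events form a homogeneous Poisson process, each event triggers direct aftershocks whose expected number grows exponentially with its magnitude and whose delays follow an Omori kernel, and all magnitudes are drawn from the Gutenberg-Richter law. The parameter values and variable names are illustrative assumptions, not those used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- assumed ETAS parameters (chosen so the branching ratio stays below 1) ---
mu    = 0.2    # background rate (events per day)
K     = 0.008  # productivity constant
alpha = 0.8    # productivity exponent (alpha < b)
c     = 0.01   # Omori offset (days)
p     = 1.2    # Omori exponent (p > 1)
b     = 1.0    # Gutenberg-Richter b-value
m0    = 3.0    # magnitude of completeness
T_END = 365.0  # catalog length (days)

def gr_magnitudes(n):
    """Draw n magnitudes above m0 from the Gutenberg-Richter law."""
    return m0 - np.log10(rng.random(n)) / b

def omori_waiting_times(n):
    """Draw n aftershock delays from the normalized Omori kernel ~ (s + c)^(-p)."""
    u = rng.random(n)
    return c * (u ** (-1.0 / (p - 1.0)) - 1.0)

# background events: homogeneous Poisson process on [0, T_END]
n_bg  = rng.poisson(mu * T_END)
times = list(rng.uniform(0.0, T_END, n_bg))
mags  = list(gr_magnitudes(n_bg))

# cascade: every event (background or aftershock) triggers its own aftershocks
queue = list(zip(times, mags))
while queue:
    t_par, m_par = queue.pop()
    # expected number of direct aftershocks of this parent (infinite time window)
    n_expected = K * 10.0 ** (alpha * (m_par - m0)) * c ** (1.0 - p) / (p - 1.0)
    n_off = rng.poisson(n_expected)
    if n_off == 0:
        continue
    t_off = t_par + omori_waiting_times(n_off)
    t_off = t_off[t_off <= T_END]        # discard aftershocks beyond the window
    m_off = gr_magnitudes(t_off.size)
    times.extend(t_off)
    mags.extend(m_off)
    queue.extend(zip(t_off, m_off))

catalog_t = np.sort(np.asarray(times))
interevent_times = np.diff(catalog_t)
print(f"{catalog_t.size} events, mean interevent time {interevent_times.mean():.3f} days")
```

The sequence of interevent times (and, in a spatial variant, interevent distances) produced by such a simulation is what the comparison with real catalogs refers to.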

Highlights

  • The mechanisms of earthquakes are still not fully understood and remain a great scientific challenge [1]

  • We show that the epidemic-type aftershock sequence (ETAS) model, which is often used to forecast earthquake events, fails to reproduce both the scaling function of real catalogs and the crossover in that function

  • Using the Gutenberg-Richter law and the exponent of the Omori law, Bak et al. [8] found that the probability density function (PDF) of interevent times for different magnitude thresholds and different spatial grid sizes can be rescaled into a single function (a minimal sketch of this rescaling is given after the list)
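As a minimal illustration of the rescaling idea in the last highlight (a simplified sketch, not the full procedure of Bak et al., which also involves spatial coarse-graining over grid cells), the interevent times above several magnitude thresholds can be measured in units of their mean and their PDFs compared. The placeholder catalog and thresholds below are assumptions and would be replaced by real event times and magnitudes.

```python
import numpy as np

# placeholder catalog: memoryless event times plus Gutenberg-Richter magnitudes
rng   = np.random.default_rng(1)
times = np.cumsum(rng.exponential(0.1, 20000))   # event times (e.g. in days)
mags  = 3.0 - np.log10(rng.random(20000)) / 1.0  # Gutenberg-Richter, b = 1, m0 = 3

def rescaled_pdf(times, mags, m_c, bins):
    """PDF of interevent times above threshold m_c, in units of their mean."""
    t   = np.sort(times[mags >= m_c])
    tau = np.diff(t)
    tau_scaled = tau / tau.mean()                # rescale by the mean interevent time
    pdf, edges = np.histogram(tau_scaled, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, pdf

# logarithmic bins: the collapse is usually inspected on log-log axes
bins = np.logspace(-3, 2, 30)
for m_c in (3.0, 3.5, 4.0):                      # illustrative thresholds
    x, f = rescaled_pdf(times, mags, m_c, bins)
    # if the scaling law holds, the (x, f) curves for the different
    # thresholds fall on approximately the same master curve
    print(f"m_c = {m_c}: first bins of rescaled PDF {np.round(f[:4], 3)}")
```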

Summary

INTRODUCTION

The mechanisms of earthquakes are still not fully understood and remain a great scientific challenge [1]. Using the Gutenberg-Richter law and the exponent of the Omori law, Bak et al. [8] found that the probability density function (PDF) of interevent times for different magnitude thresholds and different spatial grid sizes can be rescaled into a single function, which suggests a universal scaling law for earthquakes. The conditional probability method and detrended fluctuation analysis (DFA) have recently been applied to study the memory in time series of consecutive interevent intervals in real and ETAS-model earthquake data [17]. Here we study the long-term memory by considering lagged conditional probabilities in real earthquake catalogs, for both interevent times (as in previous studies) and interevent distances.
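As an illustration of the lagged-conditional-probability approach, the following is a minimal sketch, under assumed choices, of estimating the probability that the interevent time a given number of events later again falls in the lowest quartile, given that the current one does. For an uncorrelated sequence this probability stays near 0.25 at every lag; persistent deviations that decay slowly with the lag indicate long-term memory. The quartile choice and the placeholder data are assumptions, and the same estimator can be applied to the sequence of interevent distances.

```python
import numpy as np

def lagged_conditional_probability(tau, lag, q=0.25):
    """P(tau[i + lag] in the lowest q-quantile | tau[i] in the lowest q-quantile)."""
    threshold = np.quantile(tau, q)
    in_low = tau <= threshold
    now    = in_low[:-lag]   # interevent times that are currently "short"
    later  = in_low[lag:]    # the interevent time `lag` events later
    return later[now].mean()

# placeholder sequence: a memoryless (exponential) interevent-time series,
# for which the conditional probability stays close to q = 0.25 at every lag
rng = np.random.default_rng(2)
tau = rng.exponential(1.0, 10000)
for lag in (1, 2, 5, 10, 50, 100):
    print(lag, round(lagged_conditional_probability(tau, lag), 3))
```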

MEMORY IN REAL SEISMIC CATALOGS
MEMORY IN THE ETAS MODEL
CONCLUSION
