Abstract

Real-time monitoring is of primary importance for rapid and targeted emergency operations after potentially destructive earthquakes. A key aspect in determining the impact of an earthquake is the reconstruction of the ground shaking field, usually expressed as a peak ground parameter. Traditional algorithms approach this task by computing the ground shaking field from the point data recorded at the stations and, where instrumental data are missing, relying on ground motion prediction equations (GMPEs) evaluated on estimates of the earthquake location. The results of such algorithms are therefore subordinate to the estimation of location and magnitude, which can take several minutes.

To fill the gap between the arrival of the data and the (first preliminary) estimation (usually computed in a few minutes), we introduce a new data-driven algorithm that exploits the information from the station data only. The algorithm, consisting of an ensemble of convolutional neural networks (CNNs) trained on a database of ground shaking maps produced with traditional algorithms, can provide real-time estimates of the ground shaking field and the associated uncertainty. Since CNNs cannot handle sparse data, a Voronoi tessellation of a specific peak ground parameter recorded at the stations is computed and used as input to the CNNs; site effects and network geometry are accounted for using a (normalized) Vs30 map and a station location map, respectively.

The developed method is robust to noise, can handle network geometry changes over time without the need for retraining, and can resolve multiple simultaneous events. Although at a lower resolution, the results are compatible with those from traditional methods. A fully operational version of the algorithm is running on the servers at the Department of Mathematics and Geosciences of the University of Trieste, showing real-time capabilities in handling stations from multiple Italian strong-motion networks and outputting results with a resolution of 0.05° × 0.05°.
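The sketch below illustrates, under stated assumptions, how such an input could be assembled and how an ensemble could be aggregated: the Voronoi channel is obtained by nearest-neighbour assignment of the station values on the output grid, and the ensemble mean and spread give the shaking field and its uncertainty. Grid spacing, channel ordering, Vs30 normalization, and the placeholder models are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions: hypothetical station data, placeholder ensemble models,
# 0.05-degree grid as in the abstract); not the paper's actual code.
import numpy as np
from scipy.spatial import cKDTree


def build_input_channels(station_lonlat, station_pga, vs30_map, lon_grid, lat_grid):
    """Rasterize station observations into CNN input channels.

    Channel 0: Voronoi tessellation of the peak ground parameter (each grid cell
               takes the value of its nearest station).
    Channel 1: normalized Vs30 map (site effects).
    Channel 2: binary station location map (network geometry).
    """
    grid_lon, grid_lat = np.meshgrid(lon_grid, lat_grid)           # shape (H, W)
    grid_pts = np.column_stack([grid_lon.ravel(), grid_lat.ravel()])

    # Voronoi channel: nearest-neighbour assignment of station values on the grid.
    tree = cKDTree(station_lonlat)
    _, nearest = tree.query(grid_pts)
    voronoi_pga = station_pga[nearest].reshape(grid_lon.shape)

    # Station location channel: 1 at the grid cell closest to each station, 0 elsewhere.
    station_map = np.zeros_like(grid_lon)
    rows = np.abs(lat_grid[:, None] - station_lonlat[:, 1]).argmin(axis=0)
    cols = np.abs(lon_grid[:, None] - station_lonlat[:, 0]).argmin(axis=0)
    station_map[rows, cols] = 1.0

    # Simple min-max normalization of Vs30 (illustrative choice).
    vs30_norm = (vs30_map - vs30_map.min()) / (vs30_map.max() - vs30_map.min() + 1e-9)

    return np.stack([voronoi_pga, vs30_norm, station_map])          # shape (3, H, W)


def ensemble_predict(models, x):
    """Aggregate an ensemble of trained CNNs (here: arbitrary callables mapping the
    (3, H, W) input to an (H, W) field) into a mean shaking field and an uncertainty."""
    preds = np.stack([m(x) for m in models])                        # (n_models, H, W)
    return preds.mean(axis=0), preds.std(axis=0)
```

Because the station geometry only enters through the Voronoi and station-location channels, recomputing these channels at each time step lets the same trained ensemble accommodate stations joining or leaving the network without retraining, consistent with the behaviour described above.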
