Abstract

Microseismic technology has been widely used in many rock engineering applications to shield workers from engineering hazards and to monitor underground construction. To avoid the heavy workload imposed by manually recognizing large numbers of microseismic signals, this study proposes a new end-to-end trainable network architecture that automatically identifies microseismic events. A dataset containing not only easily identifiable microseismic signals but also barely distinguishable nontypical signals was collected from a practical rock engineering project for training and testing the network model. The applicability of various networks to this task is discussed in order to select the best method for microseismic recognition. We modify residual skip connections to make them more suitable for the signal classification task. We then propose a novel depthwise spatial and channel attention (DSCA) module. Similar to human attention, this module autonomously learns to weight information according to its importance, which greatly improves network performance without incurring additional computational cost. In principle, it can replace traditional denoising algorithms and model the interdependencies between the channels of a multichannel signal. Furthermore, the DSCA module and the modified residual connections are combined with a traditional convolutional network to obtain a novel architecture named ResSCA, and the results of comparative experiments are presented. Finally, single- and multichannel models built on ResSCA achieve improved accuracy rates, and their respective advantages and drawbacks are analyzed. This study presents a modified network architecture suited to identifying and classifying complex signals for intelligent microseismic monitoring, which is valuable for various rock engineering applications.
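The abstract does not specify the internal design of the DSCA module, so the following is only a minimal sketch of how a depthwise spatial and channel attention block for 1-D multichannel waveforms could be structured, assuming a squeeze-and-excitation style channel gate followed by a depthwise-convolution spatial gate. The class name DSCABlock and the reduction and kernel_size hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: an attention block that reweights both the channels and the
# time steps of a multichannel microseismic waveform. The exact DSCA layout in
# the paper may differ; layer sizes here are assumptions for illustration.
import torch
import torch.nn as nn

class DSCABlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 8, kernel_size: int = 7):
        super().__init__()
        # Channel attention: global pooling -> bottleneck -> per-channel weights
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: depthwise conv, then pointwise fusion to a
        # per-time-step weight map
        self.spatial_gate = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size,
                      padding=kernel_size // 2, groups=channels),  # depthwise
            nn.Conv1d(channels, 1, kernel_size=1),                 # pointwise
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        x = x * self.channel_gate(x)   # emphasize informative channels
        x = x * self.spatial_gate(x)   # emphasize informative time segments
        return x

# Example: attend over a batch of 16-channel feature maps extracted from
# 2048-sample waveform windows (sizes are arbitrary for the demo).
if __name__ == "__main__":
    block = DSCABlock(channels=16)
    features = torch.randn(4, 16, 2048)
    print(block(features).shape)  # torch.Size([4, 16, 2048])
```

Because both gates reduce to elementwise multiplications by learned weight maps, a block of this kind adds only a small number of parameters, which is consistent with the abstract's claim that the attention mechanism improves performance without incurring additional computational cost.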
