Abstract

The fiber-optic distributed acoustic sensor (DAS), which uses existing communication cables as its sensing medium, plays an important role in urban infrastructure monitoring and natural disaster prediction. To cope with the wide-ranging, dynamic environments of urban areas, a fast and accurate DAS signal recognition method is proposed based on an end-to-end attention-enhanced ResNet model. In preprocessing, an objective evaluation method compares the distinguishability of different input features via the Euclidean distance between the posterior probabilities of correctly and incorrectly classified samples; an end-to-end ResNet is then optimized with the chosen time-frequency feature as input, and a convolutional block attention module (CBAM) is added, which quickly focuses on key information across channels and specific signal structures and further improves recognition performance. The results show that the proposed ResNet+CBAM model outperforms 1-D CNN, 2-D CNN, ResNet, and 2-D CNN+CBAM in recognition accuracy, convergence rate, generalization capability, and computational efficiency. An average accuracy above 99.014% is achieved in field testing; across multiple scenarios with inconsistent cable-laying or burial environments, accuracy remains above 91.08%. The time cost is only 3.3 ms per signal sample, making the method well suited to online, long-distance distributed monitoring applications.
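The abstract describes selecting input features by the Euclidean distance between posterior probabilities of correctly and incorrectly classified samples. A minimal NumPy sketch of one plausible form of that criterion is shown below; the function name and the choice of averaging pairwise distances between the two groups are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def feature_distinguishability(posteriors, labels):
    """Hypothetical distinguishability score for an input feature:
    the mean Euclidean distance between the classifier's posterior
    probability vectors of correctly and incorrectly classified
    samples. A larger score suggests the feature separates the
    classes more cleanly.

    posteriors : (n_samples, n_classes) array of softmax outputs
    labels     : (n_samples,) array of ground-truth class indices
    """
    preds = posteriors.argmax(axis=1)
    correct = posteriors[preds == labels]   # posteriors of hits
    wrong = posteriors[preds != labels]     # posteriors of misses
    if len(wrong) == 0:                     # no errors: perfectly separable
        return float("inf")
    if len(correct) == 0:                   # no hits: no separation at all
        return 0.0
    # All pairwise Euclidean distances between the two groups via broadcasting
    diffs = correct[:, None, :] - wrong[None, :, :]
    return float(np.linalg.norm(diffs, axis=-1).mean())

# Toy example: 3 samples, 2 classes; the third sample is misclassified
posteriors = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
labels = np.array([0, 1, 1])
score = feature_distinguishability(posteriors, labels)  # ≈ 0.495
```

Under this sketch, candidate features (e.g. different time-frequency representations) would each be scored with a baseline classifier, and the feature with the largest score chosen as the network input.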
