Abstract

Testing with synthetic data sets is a vital stage in an algorithm's development, allowing its performance to be benchmarked. A common addition to synthetic data sets is White Gaussian Noise (WGN), used to mimic the noise that would be present in recorded data. The first section of this paper compares the effects of WGN and realistic modelled noise on standard microseismic event detection and imaging algorithms, using synthetic data sets with recorded noise as a benchmark. The data sets with WGN underperform on the trace-by-trace algorithm while overperforming on algorithms that utilize the full array. Throughout, the data sets with realistic modelled noise perform nearly identically to the recorded-noise data sets. The study concludes by testing an algorithm that simultaneously solves for the source location and moment tensor of a microseismic event. Not only does the algorithm fail to perform at the signal-to-noise ratios indicated by the WGN results, but the results with realistic modelled noise also expose pitfalls of the algorithm not previously identified. The misleading results from the WGN data sets highlight the need to test algorithms under realistic noise conditions, both to understand the conditions under which an algorithm can perform and to minimize the risk of misinterpreting its results.
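For context, WGN is typically added to a synthetic trace by scaling zero-mean Gaussian samples against the clean signal's RMS amplitude to hit a target signal-to-noise ratio. The sketch below is illustrative only and does not reproduce the paper's modelled-noise workflow; the function name, the dB-based SNR definition, and the Ricker-style test wavelet are assumptions for demonstration.

```python
import numpy as np

def add_wgn(signal, snr_db, rng=None):
    """Add white Gaussian noise to a 1-D trace at a target SNR (in dB).

    The noise standard deviation is derived from the RMS amplitude of
    the clean signal, so snr_db = 20*log10(rms_signal / rms_noise).
    """
    rng = np.random.default_rng() if rng is None else rng
    rms_signal = np.sqrt(np.mean(signal ** 2))
    rms_noise = rms_signal / (10.0 ** (snr_db / 20.0))
    noise = rng.normal(0.0, rms_noise, size=signal.shape)
    return signal + noise

# Example: a synthetic Ricker-like wavelet buried in WGN at 5 dB SNR
# (illustrative values, not taken from the paper).
t = np.linspace(-0.1, 0.1, 401)
f0 = 30.0  # dominant frequency in Hz
wavelet = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)
noisy_trace = add_wgn(wavelet, snr_db=5.0)
```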
