Abstract

Microseismic datasets typically contain waveforms with relatively low signal‐to‐noise ratios. Consequently, noise‐suppression techniques are often applied to improve the signal‐to‐noise ratio of the recorded waveforms. We apply a linear geometric mode decomposition approach to microseismic datasets for background noise suppression. The geometric mode decomposition method optimizes linear patterns within amplitude–frequency modulated modes and can efficiently distinguish microseismic events (signal) from the background noise. The method can also split linear and non‐linear dispersive seismic events into linear modes. The events segmented into different modes can then be summed selectively to reconstruct the denoised signal. Geometric mode decomposition is well suited to microseismic acquisitions with small receiver spacing, where the signal may exhibit either (nearly) linear or non‐linear recording patterns, depending on the source location relative to the receiver array. Using synthetic and real microseismic data examples from limited‐aperture downhole recordings, we show that geometric mode decomposition is robust in suppressing background noise from the recorded waveforms. We also compare the filtering results from geometric mode decomposition with those obtained from FX‐Decon and one‐dimensional variational mode decomposition. For the examples considered, geometric mode decomposition outperforms both FX‐Decon and one‐dimensional variational mode decomposition in background noise suppression.
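The decompose–select–reconstruct workflow summarized above can be illustrated with a minimal sketch. This is not the geometric mode decomposition of the paper (which optimizes the modes themselves); it substitutes fixed FFT frequency bands as stand-in "modes" purely to show how a denoised trace is rebuilt by summing the modes that carry the signal. The function `bandpass_modes`, the band edges, and the synthetic Ricker-like event are all illustrative assumptions.

```python
import numpy as np

def bandpass_modes(trace, fs, bands):
    """Split a 1-D trace into frequency-band 'modes' via FFT masking.

    A simple stand-in for the mode decompositions discussed in the
    abstract (GMD/VMD), which optimize their modes rather than using
    fixed bands as done here.
    """
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum = np.fft.rfft(trace)
    modes = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        modes.append(np.fft.irfft(spectrum * mask, n=n))
    return modes

# Synthetic trace: a 30 Hz Ricker wavelet plus broadband noise
# (hypothetical parameters, for illustration only).
fs = 500.0
t = np.arange(0, 1.0, 1.0 / fs)
arg = (np.pi * 30.0 * (t - 0.5)) ** 2
signal = (1 - 2 * arg) * np.exp(-arg)
rng = np.random.default_rng(0)
noisy = signal + 0.3 * rng.standard_normal(t.size)

# Decompose into three bands, keep only the band containing the event.
modes = bandpass_modes(noisy, fs, bands=[(0, 10), (10, 60), (60, 250)])
denoised = modes[1]

err_noisy = np.linalg.norm(noisy - signal)
err_denoised = np.linalg.norm(denoised - signal)
print(err_denoised < err_noisy)  # keeping the signal band reduces the misfit
```

In the actual methods compared in the abstract, the mode selection is driven by the optimized amplitude–frequency modulated modes rather than by hand-picked frequency bands.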