Abstract

High signal-to-noise ratio magnetotelluric (MT) data are crucial for accurately interpreting subsurface structures. Recently, deep learning has become popular for MT denoising due to its ability to avoid parameter tuning and enable real-time processing. These methods typically fit or predict signals in noisy segments after identifying and segmenting signal and noise in the time domain. However, these methods struggle to preserve both low- and high-frequency signals effectively due to high noise levels in these segments. To address this issue, we propose a novel deep learning denoising method that separately recovers low- and high-frequency signals using distinct strategies. Low-frequency signals are fitted using an inverse autoencoder with a channel attention mechanism, effectively removing high-frequency components. High-frequency signals are then predicted using a bidirectional long short-term memory network (BiLSTM) combined with a squeeze-and-excitation (SE) mechanism, enhancing prediction by considering both global and local signal characteristics. Additionally, we introduce the multivariate state estimation technique (MSET) for real-time signal-noise identification. MSET analyzes residuals after separating low-frequency signals to identify noise. Denoising is performed only on segments with significant noise, preserving more effective signals. Finally, the fitted low-frequency dominant component and predicted high-frequency component are combined to form the denoised MT signals. This combined approach significantly improves the restoration quality of effective signals compared to existing methods. Experimental results demonstrate that our method exhibits superior denoising capabilities in both quantitative and qualitative evaluations, including apparent resistivity-phase curves and polarization direction analysis, offering enhanced performance over current deep learning methods.
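To make the high-frequency branch concrete, the sketch below shows one plausible way to combine a BiLSTM with a squeeze-and-excitation (SE) channel-attention block in PyTorch. The abstract does not specify the framework, layer widths, input shapes, or module names, so everything here (class names, hidden size, reduction ratio, segment length) is an illustrative assumption rather than the authors' implementation.

```python
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-excitation channel attention (hypothetical sizes)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):             # x: (batch, channels, length)
        w = x.mean(dim=-1)            # squeeze: global average over time
        w = self.fc(w).unsqueeze(-1)  # excitation: per-channel weights
        return x * w                  # recalibrate channel responses


class HighFreqPredictor(nn.Module):
    """Illustrative BiLSTM + SE model for the high-frequency component."""
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.se = SEBlock(channels=2 * hidden)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, x):             # x: (batch, length, 1) residual series
        h, _ = self.lstm(x)           # (batch, length, 2*hidden)
        h = self.se(h.transpose(1, 2)).transpose(1, 2)
        return self.out(h)            # predicted high-frequency signal


# Usage sketch: predict the high-frequency part of a 1024-sample MT segment
model = HighFreqPredictor()
segment = torch.randn(8, 1024, 1)    # assumed residual after low-frequency fit
pred = model(segment)                # (8, 1024, 1)
```

In the paper's pipeline this branch would operate on the residual left after the inverse-autoencoder low-frequency fit, with the two outputs summed to form the denoised signal; the snippet above only illustrates the BiLSTM/SE combination itself.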
