Linear frequency modulation (LFM) signals are pivotal in radar systems, enabling high-resolution measurements and target detection. However, these signals are often degraded by noise, which significantly impairs their processing and interpretation. Traditional denoising methods, including wavelet-based techniques, have been used extensively to address this issue, yet their performance is limited by fixed parameter settings. This paper introduces an approach that combines wavelet denoising with long short-term memory (LSTM) networks, tailored specifically to LFM signals in radar systems. We generated a dataset of LFM signals at various signal-to-noise ratios (SNRs) to ensure diversity and systematically identified the optimal wavelet parameters for each noisy instance. These parameters served as training labels for the proposed LSTM-based architecture, which learned to predict the most effective denoising parameters for a given noisy LFM signal. Our findings reveal a significant enhancement in denoising performance, attributable to the optimized wavelet parameters predicted by the LSTM. This advancement demonstrates superior denoising capability and suggests a substantial improvement in radar signal processing, potentially leading to more accurate and reliable radar detections and measurements. The implications extend beyond modern radar applications, offering a framework for integrating deep learning techniques with traditional signal processing methods to optimize performance across noise-dominated domains.
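To make the labeling stage concrete, the sketch below generates a noisy LFM signal and grid-searches wavelet-denoising parameters to find the best-performing combination, i.e., the kind of label the LSTM would be trained to predict. This is a minimal illustration, not the paper's method: the PyWavelets library, the `lfm_signal`, `add_noise`, and `wavelet_denoise` helpers, the soft-thresholding rule, and the candidate parameter grids are all assumptions, since the abstract does not specify which wavelet parameters were optimized or how.

```python
# Hypothetical sketch: label a noisy LFM signal with the wavelet-denoising
# parameters that maximize output SNR. Parameter grids are illustrative.
import numpy as np
import pywt  # PyWavelets (assumed toolkit; not named in the abstract)

def lfm_signal(fs=1e6, duration=1e-3, f0=0.0, f1=2e5):
    """Linear frequency-modulated (chirp) signal."""
    t = np.arange(0, duration, 1 / fs)
    k = (f1 - f0) / duration                         # chirp rate
    return np.cos(2 * np.pi * (f0 * t + 0.5 * k * t**2))

def add_noise(x, snr_db):
    """Add white Gaussian noise at a target SNR in dB."""
    p_noise = np.mean(x**2) / 10**(snr_db / 10)
    return x + np.sqrt(p_noise) * np.random.randn(len(x))

def wavelet_denoise(x, wavelet, level, thr_scale):
    """Soft-threshold detail coefficients; threshold from a MAD noise estimate."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate
    thr = thr_scale * sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

clean = lfm_signal()
noisy = add_noise(clean, snr_db=0)

# Grid search for the best (wavelet, level, threshold scale) label.
best = None
for wavelet in ["db4", "db8", "sym8"]:               # candidate families
    for level in [3, 4, 5]:
        for thr_scale in [0.5, 1.0, 1.5]:
            den = wavelet_denoise(noisy, wavelet, level, thr_scale)
            out_snr = 10 * np.log10(np.mean(clean**2)
                                    / np.mean((clean - den)**2))
            if best is None or out_snr > best[0]:
                best = (out_snr, wavelet, level, thr_scale)

print(f"label: wavelet={best[1]}, level={best[2]}, "
      f"thr_scale={best[3]:.1f}, output SNR={best[0]:.1f} dB")
```

Repeating this search over many signals and SNRs yields (noisy signal, best parameters) pairs; an LSTM trained on such pairs can then predict effective denoising parameters directly from a noisy input, avoiding the fixed settings that limit conventional wavelet denoising.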