Abstract
With advancements in phased-array and cognitive technologies, the adaptability of modern multifunction radars (MFRs) has improved significantly, enabling greater flexibility in waveform parameters and beam scheduling. However, these enhancements make it increasingly difficult for traditional radar recognition methods to establish fixed relationships between working modes. Furthermore, conventional approaches often exhibit limited robustness and computational efficiency in complex or noisy environments. To address these challenges, this paper proposes a joint learning framework for MFR working mode recognition based on a hybrid model that combines convolutional neural networks (CNNs) and Transformers. The CNN module uses local convolution operations to extract local features from radar pulse sequences, capturing the dynamic patterns of radar waveforms across different modes. Simultaneously, the multi-head attention mechanism in the Transformer module models long-range dependencies within the sequences, capturing the “semantic information” of waveform scheduling intrinsic to MFR behavior. By integrating features across multiple levels, the hybrid model effectively recognizes MFR working modes. The Mercury MFR is used for modeling and simulation, and extensive experiments demonstrate that the proposed hybrid model achieves robust and reliable identification of advanced MFR working modes even in complex electromagnetic environments.
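The hybrid architecture summarized above can be sketched roughly as follows. This is an illustrative NumPy toy, not the paper's actual model: it uses a single untrained convolution layer, a single attention head rather than multi-head attention, random weights, and a hypothetical two-feature pulse representation (e.g., normalized PRI and RF per pulse), purely to show how local convolutional features and sequence-wide attention compose into a mode classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w, b):
    """Valid 1-D convolution over the pulse axis, followed by ReLU.
    x: (T, C_in) pulse sequence; w: (K, C_in, C_out); b: (C_out,)."""
    K, _, C_out = w.shape
    T = x.shape[0] - K + 1
    out = np.zeros((T, C_out))
    for t in range(T):
        # local window of K pulses -> one feature vector (local features)
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention across the whole
    sequence, modeling long-range dependencies between pulses."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    return softmax(scores, axis=-1) @ V

def classify(pulse_seq, p):
    h = conv1d_relu(pulse_seq, p["w_conv"], p["b_conv"])      # CNN stage
    h = h + self_attention(h, p["Wq"], p["Wk"], p["Wv"])      # attention stage (residual)
    pooled = h.mean(axis=0)                                   # aggregate over pulses
    return softmax(pooled @ p["W_out"] + p["b_out"])          # mode probabilities

d, n_modes = 16, 4  # illustrative sizes, not from the paper
params = {
    "w_conv": rng.normal(scale=0.1, size=(3, 2, d)),
    "b_conv": np.zeros(d),
    "Wq": rng.normal(scale=0.1, size=(d, d)),
    "Wk": rng.normal(scale=0.1, size=(d, d)),
    "Wv": rng.normal(scale=0.1, size=(d, d)),
    "W_out": rng.normal(scale=0.1, size=(d, n_modes)),
    "b_out": np.zeros(n_modes),
}

# Toy pulse sequence: 32 pulses, 2 features each (hypothetical PRI/RF).
pulses = rng.normal(size=(32, 2))
probs = classify(pulses, params)
```

In a trained version of such a model, the convolution captures short-scale waveform patterns while the attention weights let distant pulses in the scheduling sequence influence one another, matching the two roles the abstract assigns to the CNN and Transformer modules.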