Abstract

Amidst the evolving landscape of non-cooperative communication, automatic modulation classification (AMC) stands as an essential pillar, enabling adaptive and reliable signal processing. With the advancement of deep learning (DL), neural networks have been widely applied to AMC. However, existing DL models suffer from inter-class confusion among high-order modulations. To address this issue, we propose a multitask-learning-empowered hybrid neural network, named CrossTLNet. Specifically, after a signal enters the model, it is first transformed into two task components: in-phase/quadrature (I/Q) form and amplitude/phase (A/P) form. For each task, we design a method that combines a temporal convolutional network (TCN) with a long short-term memory (LSTM) network to effectively capture long-term dependency features in high-order modulations. To enable interaction between these two different dimensional features, we introduce a cross-attention method, further enhancing the model's ability to distinguish signal features. Moreover, we design a simple and efficient knowledge distillation method to reduce the size of CrossTLNet, making it easier to deploy in real-time or resource-limited scenarios. The experimental results indicate that the proposed method achieves excellent AMC performance on public benchmarks, especially for high-order modulations.
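The two front-end transforms and the cross-branch fusion described above can be illustrated with a minimal NumPy sketch. The I/Q-to-A/P conversion follows the standard definitions (amplitude as the magnitude of the complex sample, phase via `arctan2`); the `cross_attention` function is a generic scaled dot-product attention in which one branch supplies the queries and the other supplies the keys and values. All function names and dimensions here are illustrative assumptions, not the paper's actual implementation, which uses TCN/LSTM feature extractors rather than raw samples.

```python
import numpy as np

def iq_to_ap(iq):
    """Convert (N, 2) in-phase/quadrature samples to amplitude/phase form."""
    i, q = iq[:, 0], iq[:, 1]
    amplitude = np.sqrt(i ** 2 + q ** 2)
    phase = np.arctan2(q, i)
    return np.stack([amplitude, phase], axis=1)

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, kv_feats):
    """Scaled dot-product attention: one branch queries the other.

    query_feats: (N, d) features from one branch (e.g. A/P)
    kv_feats:    (M, d) features from the other branch (e.g. I/Q)
    returns:     (N, d) fused features
    """
    d = query_feats.shape[-1]
    scores = query_feats @ kv_feats.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ kv_feats

# Toy usage: a random I/Q sequence, its A/P view, and their fusion.
rng = np.random.default_rng(0)
iq = rng.standard_normal((128, 2))
ap = iq_to_ap(iq)
fused = cross_attention(ap, iq)  # A/P branch attends to the I/Q branch
print(fused.shape)  # (128, 2)
```

In the full model, the inputs to `cross_attention` would be the feature sequences produced by each branch's TCN–LSTM stack, letting the I/Q and A/P representations inform one another before classification.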
