Abstract
Wireless interference recognition (WIR) is an indispensable technology in non-cooperative communication systems. Transformers can extract global features and, compared with convolutional networks, have recently achieved striking performance for WIR. However, the self-attention module in Transformers incurs heavy computational overhead, which hinders deployment on resource-constrained devices. In this letter, we aim to alleviate this problem and propose WIR-Transformer. Specifically, WIR-Transformer partitions the input into regions and computes self-attention independently within each region, which effectively reduces the complexity. To overcome the resulting bottleneck of blocked information flow between regions, we propose an information exchange module (IEM). To further reduce the complexity, we introduce a novel patch aggregation module (PAM) that reduces the number of patches and fuses local information. Experiments demonstrate that the proposed WIR-Transformer achieves higher accuracy than conventional WIR methods.
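To make the complexity argument concrete, below is a minimal sketch of region-partitioned self-attention followed by patch aggregation. The paper's exact WIR-Transformer architecture, region size, and layer definitions are not given in the abstract, so the class names (`RegionAttention`, `PatchAggregation`), the Swin-style non-overlapping windowing, and all hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class RegionAttention(nn.Module):
    """Self-attention computed independently inside each region.

    Splitting an L-patch sequence into regions of size s reduces the
    attention cost from O(L^2) to O((L/s) * s^2) = O(L * s).
    """
    def __init__(self, dim, region_size, num_heads=4):
        super().__init__()
        self.region_size = region_size
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):                    # x: (B, L, C), L divisible by s
        B, L, C = x.shape
        s = self.region_size
        x = x.reshape(B * L // s, s, C)      # group patches into regions
        out, _ = self.attn(x, x, x)          # attention within each region only
        return out.reshape(B, L, C)

class PatchAggregation(nn.Module):
    """Merges adjacent patch pairs, halving the patch count (hypothetical
    stand-in for the PAM described in the abstract)."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, x):                    # x: (B, L, C) -> (B, L/2, C)
        B, L, C = x.shape
        return self.proj(x.reshape(B, L // 2, 2 * C))

# usage
x = torch.randn(2, 64, 32)                   # 64 patches, 32-dim features
y = RegionAttention(32, region_size=8)(x)    # (2, 64, 32)
z = PatchAggregation(32)(y)                  # (2, 32, 32)
```

Note that, as the abstract points out, attention restricted to regions blocks information flow across region boundaries; an exchange step such as the proposed IEM (e.g., shifting regions or mixing features across them) is needed between such layers.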