Abstract

The specific emitter identification task often encounters channel inconsistency between the collected signals and the signals to be recognised: different channel environments cause a mismatch in the sample distributions, which leads to a significant drop in recognition accuracy. The same problem is encountered in transfer learning. The authors focus on an important category of transfer-learning methods, domain adaptation, to solve this problem. Motivated by the ideas of the Fisher discriminant and metric learning, a novel domain adaptation method called the multi-discrepancy deep adaptation network is proposed. In contrast to current approaches, which reduce the discrepancy between the source and target domain distributions without considering category information, the basic idea of the proposed method is to increase interclass discrepancies between the source and target domains while decreasing intraclass discrepancies. To adjust the training weights of the interclass and intraclass discrepancies, dynamic balancing and learning weight factors are introduced; the dynamic balancing factor adaptively adjusts the weights between the interclass and intraclass variances according to the latest trends of both variances, which is effective in improving performance. The authors applied the network to a specific emitter identification problem under different channels on the ORACLE radio frequency fingerprinting dataset and demonstrated a maximum domain adaptability increase of 15.4%, outperforming existing domain adaptation methods.
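The core objective described above, pulling same-class features together across domains while pushing different-class features apart under a balancing factor, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the linear-kernel discrepancy measure, the `multi_discrepancy_loss` helper, and the use of target pseudo-labels are all assumptions made for the sake of the example.

```python
import numpy as np

def discrepancy(x, y):
    """Simplified linear-kernel MMD: squared distance between feature means."""
    return float(np.sum((x.mean(axis=0) - y.mean(axis=0)) ** 2))

def multi_discrepancy_loss(src_feats, src_labels, tgt_feats, tgt_pseudo, beta):
    """Hypothetical multi-discrepancy objective:
    minimise intraclass (same class, across domains) discrepancy while
    maximising interclass (different classes, across domains) discrepancy,
    traded off by the balancing factor beta."""
    classes = np.unique(src_labels)
    intra = inter = 0.0
    n_intra = n_inter = 0
    for cs in classes:
        sc = src_feats[src_labels == cs]
        for ct in classes:
            tc = tgt_feats[tgt_pseudo == ct]
            if len(sc) == 0 or len(tc) == 0:
                continue  # a class may be absent from the pseudo-labelled batch
            d = discrepancy(sc, tc)
            if cs == ct:
                intra += d
                n_intra += 1
            else:
                inter += d
                n_inter += 1
    intra /= max(n_intra, 1)
    inter /= max(n_inter, 1)
    # Lower is better: aligned classes shrink intra, separated classes grow inter.
    return intra - beta * inter
```

When the target features line up with the correct source classes, the loss is low (intraclass term near zero, interclass term large); when the clusters are swapped, the loss is high, which is the behaviour the training signal rewards.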

