Abstract

For multichannel airborne radars, wide-area ground-moving target indication (WGMTI) processing can quickly obtain the dynamic distribution of moving targets over a wide area, which is of considerable significance in many fields. Nevertheless, the WGMTI mode suffers from strong ground clutter, which frequently submerges slow-moving targets and causes many false alarms in subsequent moving target detection. Space–time adaptive processing (STAP) can effectively suppress clutter, but its performance depends critically on the available training samples. Consequently, an effective STAP method with fast processing and a small sample requirement must be developed for WGMTI applications in multichannel airborne radars. In this paper, a subarray-level sparse recovery STAP (SR-STAP) processing framework is proposed for multichannel airborne radars. First, the characteristics of the subarray-level received clutter are discussed in detail. Second, on the basis of this analysis, we design a joint space–time dictionary and develop a separable tensor-based sparse Bayesian learning (STSBL) method. In this method, a two-stage decomposition is proposed so that large-scale data can be reduced to small-scale data during processing, which significantly improves computational efficiency. Finally, the effectiveness of the proposed STSBL-STAP method in WGMTI processing is verified using real measurement data obtained from a developed dual-channel Ku-band airborne radar.
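To make the SR-STAP idea concrete, the following is a minimal illustrative sketch of a generic sparse-Bayesian-learning SR-STAP pipeline, not the paper's STSBL method: a joint space–time dictionary is built over a spatial/Doppler grid, an EM-style SBL iteration recovers the clutter spectrum from a single snapshot, and the result forms the clutter covariance for MVDR-style STAP weights. All grid sizes, parameter values, and the `steering`/`sbl_stap` helpers are hypothetical choices for this sketch.

```python
import numpy as np

# Minimal SBL-based SR-STAP sketch (illustrative; not the paper's STSBL).
# Assumptions: N spatial channels, M pulses, normalized spatial/Doppler
# frequencies on a uniform grid; all sizes below are arbitrary choices.

N, M = 2, 16            # channels, pulses (dual-channel, as in the paper)
Ns, Nd = 11, 41         # spatial / Doppler grid sizes (illustrative)

def steering(fs, fd, N, M):
    """Joint space-time steering vector for normalized frequencies."""
    a = np.exp(2j * np.pi * fs * np.arange(N))   # spatial steering
    b = np.exp(2j * np.pi * fd * np.arange(M))   # temporal (Doppler) steering
    return np.kron(b, a)

# Joint space-time dictionary over the uniform frequency grid
fs_grid = np.linspace(-0.5, 0.5, Ns)
fd_grid = np.linspace(-0.5, 0.5, Nd)
Phi = np.stack([steering(fs, fd, N, M)
                for fd in fd_grid for fs in fs_grid], axis=1)  # (N*M, Ns*Nd)

def sbl_stap(x, Phi, sigma2=1e-2, iters=50):
    """EM-style SBL: estimate the sparse clutter spectrum gamma from one
    snapshot x, then form the clutter-plus-noise covariance estimate."""
    NM, K = Phi.shape
    gamma = np.ones(K)
    for _ in range(iters):
        # Posterior of sparse coefficients via C = Phi*Gamma*Phi^H + sigma2*I
        C = (Phi * gamma) @ Phi.conj().T + sigma2 * np.eye(NM)
        mu = gamma * (Phi.conj().T @ np.linalg.solve(C, x))   # posterior mean
        Cinv_Phi = np.linalg.solve(C, Phi)
        Sigma_diag = gamma - gamma**2 * np.real(
            np.sum(Phi.conj() * Cinv_Phi, axis=0))            # posterior var
        gamma = np.abs(mu)**2 + np.maximum(Sigma_diag, 0)     # M-step update
    R_hat = (Phi * gamma) @ Phi.conj().T + sigma2 * np.eye(NM)
    return gamma, R_hat

# One synthetic clutter snapshot: side-looking clutter ridge (fd = fs) + noise
ridge = sum(steering(f, f, N, M) for f in np.linspace(-0.4, 0.4, 9))
x = ridge + 0.1 * (np.random.randn(N * M) + 1j * np.random.randn(N * M))

gamma, R_hat = sbl_stap(x, Phi)
s = steering(0.2, -0.3, N, M)        # hypothesized target steering vector
w = np.linalg.solve(R_hat, s)
w /= (s.conj() @ w)                  # MVDR-style STAP weights: R^-1 s / s^H R^-1 s
print("filter output magnitude:", np.abs(w.conj() @ x))
```

Because the covariance is built from a single snapshot rather than from neighboring range cells, this style of SR-STAP sidesteps the small-sample-size problem the abstract highlights; the paper's separable tensor formulation additionally splits the large joint dictionary into small spatial and temporal factors to cut the computational cost.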
