Abstract

Adaptive beamforming techniques are now widely used to reject interference (jammer/clutter) signals in radar, communication, and sonar applications. In adaptive arrays using the sample matrix inversion (SMI) algorithm, inadequate estimation of the covariance matrix results in adaptive antenna patterns with high sidelobes and distorted mainbeams. In this paper, a method is proposed to precisely control the peak (rather than average) sidelobe level of adaptive array patterns. The proposed method is also generalized to adaptive array antennas with moderate bandwidth and large random amplitude and phase errors. Theoretical analysis and simulation results are provided to illustrate the performance of the proposed method.
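To make the baseline concrete, the following is a minimal sketch of a conventional SMI beamformer for a uniform linear array, not the paper's proposed peak-sidelobe-control method. All names, array sizes, and scenario parameters (e.g. steering_vector, smi_weights, a 16-element array, one jammer at 30 degrees, K = 32 snapshots) are illustrative assumptions; the sketch only shows how a sample covariance estimate from few snapshots yields the adaptive weights whose pattern can exhibit the high sidelobes the abstract refers to.

```python
import numpy as np

def steering_vector(n_elements, theta_rad, d_over_lambda=0.5):
    # Steering vector of a uniform linear array for angle theta (radians from broadside).
    n = np.arange(n_elements)
    return np.exp(2j * np.pi * d_over_lambda * n * np.sin(theta_rad))

def smi_weights(snapshots, v_steer, loading=0.0):
    # snapshots: (n_elements, K) matrix of K interference-plus-noise snapshots.
    # The sample covariance estimate below is "inadequate" when K is small,
    # which is what produces high sidelobes and mainbeam distortion.
    K = snapshots.shape[1]
    R_hat = snapshots @ snapshots.conj().T / K
    R_hat += loading * np.eye(R_hat.shape[0])      # optional diagonal loading
    w = np.linalg.solve(R_hat, v_steer)            # SMI: w proportional to R^-1 v
    return w / (v_steer.conj() @ w)                # unit gain in the look direction

# Hypothetical scenario: 16-element array, one strong jammer at 30 deg, K = 32 snapshots.
rng = np.random.default_rng(0)
N, K = 16, 32
v0 = steering_vector(N, 0.0)                       # look direction: broadside
vj = steering_vector(N, np.deg2rad(30.0))          # jammer direction
jam = 10.0 * vj[:, None] * (rng.standard_normal((1, K)) + 1j * rng.standard_normal((1, K)))
noise = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)
w = smi_weights(jam + noise, v0)

# Adaptive pattern in dB; its sidelobe peaks reflect the covariance estimation error.
angles = np.deg2rad(np.linspace(-90.0, 90.0, 721))
pattern_db = 20 * np.log10(
    np.abs(np.array([w.conj() @ steering_vector(N, a) for a in angles])) + 1e-12
)
```

Increasing K (or adding diagonal loading) smooths the estimated covariance and lowers the sidelobe peaks, which is the behavior the proposed peak-sidelobe-control method is designed to guarantee directly rather than on average.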
