Abstract

Modern high-resolution radars often employ planar phased array antennas. The patterns for these arrays are usually designed to have narrow mainbeams and low sidelobes. This is best achieved by minimizing the total sidelobe energy in the array factor rather than by setting a sidelobe threshold level, as is usually done. A method for obtaining such minimizing patterns is described here for both sum and difference array factors. We define the optimum array factor for a linear array as the one that minimizes the total array factor energy outside a specified angular range about the peak of the mainbeam. This optimization technique is at least as easy to apply as the older techniques and is superior for many applications. In addition, it places far fewer restrictions on the placement of the elements and is therefore more general. The method extends readily to nonuniform array designs; however, an initial look at the performance of nonuniform arrays gave disappointing results for the sum array factor.
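As a concrete illustration of the criterion above, the sketch below computes sum-pattern weights for a linear array by maximizing the fraction of array factor energy inside a mainbeam region |u| <= u0 in sine space (u = sin(theta)), which is equivalent to minimizing the energy outside it. This is a minimal sketch of the energy-concentration idea only: the half-wavelength spacing, the value of u0, and all variable names are illustrative assumptions, not details taken from the paper, which also treats difference patterns and nonuniform element placements.

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative parameters (not from the paper): a uniform linear array of
# N isotropic elements at half-wavelength spacing, with the mainbeam
# region defined as |u| <= u0 in sine space, u = sin(theta).
N = 16
d = 0.5            # element spacing in wavelengths
u0 = 0.2           # half-width of the mainbeam region in u
k = 2 * np.pi      # wavenumber for positions measured in wavelengths

x = d * (np.arange(N) - (N - 1) / 2)   # element positions (wavelengths)
dx = x[:, None] - x[None, :]           # pairwise position differences

# Quadratic-form matrices: w^T A w is the array-factor energy inside the
# mainbeam region, w^T B w the energy over all visible space u in [-1, 1].
# np.sinc(t) = sin(pi t) / (pi t), so the integrals reduce to sinc terms.
A = 2 * u0 * np.sinc(2 * dx * u0)
B = 2 * np.sinc(2 * dx)

# Maximizing w^T A w / w^T B w (energy concentrated in the mainbeam) is
# equivalent to minimizing the energy outside it.  eigh solves the
# generalized symmetric eigenproblem A w = lambda B w, eigenvalues ascending.
vals, vecs = eigh(A, B)
w = vecs[:, -1]                        # weights with maximum concentration
print(f"fraction of energy inside the mainbeam: {vals[-1]:.4f}")

# Evaluate the resulting sum-pattern array factor on a grid in u.
u = np.linspace(-1, 1, 2001)
af = np.exp(1j * k * np.outer(u, x)) @ w
af_db = 20 * np.log10(np.abs(af) / np.abs(af).max())
print(f"peak sidelobe outside the mainbeam region: {af_db[np.abs(u) > u0].max():.1f} dB")
```

The generalized eigenproblem arises because both the in-beam and total energies are quadratic forms in the weights; for a uniform half-wavelength array this construction coincides with the discrete prolate spheroidal (Slepian) sequence, and nothing in it requires the element positions to be uniform.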
