Abstract

A burst waveform is a finite sequence of pulses transmitted with a staggered pulse-repetition frequency (PRF); it is used as a high-resolution radar waveform. The ambiguity function of a burst waveform has a good peak-to-sidelobe ratio along the range axis, but along the Doppler axis the ratio is much poorer. A mismatched receiving filter is the natural way to increase the peak-to-sidelobe ratio of the ambiguity function along the Doppler axis, since Taylor weighting suppresses only the Doppler sidelobes close to the main peak. In this paper we derive, by the techniques of nonlinear programming, an iterative method for calculating a mismatched filter that is optimum in the following sense. Let an interval <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">D</tex> of the Doppler axis be specified, together with a desired peak-to-sidelobe ratio W. Our method then calculates the mismatched filter with the best signal-to-noise ratio that reduces the Doppler sidelobes to the specified level over the specified interval, if such a filter exists. If no such filter exists, the calculated filter still tends to suppress the sidelobes over the specified interval of the Doppler axis.
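The trade-off the abstract describes can be illustrated with a small numerical sketch. The code below is not the paper's nonlinear-programming iteration; it uses a simple penalized least-squares weighting (an MVDR-style closed form) to show the same qualitative effect: a mismatched receive weighting suppresses the zero-delay Doppler sidelobes of a staggered-PRF burst over a chosen interval, at some cost in signal-to-noise ratio. The pulse count, stagger pattern, sidelobe interval, and penalty weight are all assumed values chosen for illustration.

```python
import numpy as np

# Assumed burst parameters (illustration only).
N = 8
base_pri = 1e-3                                    # nominal PRI, seconds
stagger = 1 + 0.05 * np.array([0, 1, -1, 2, -2, 1, 0, -1])
t = np.concatenate(([0.0], np.cumsum(base_pri * stagger[:-1])))  # pulse times

def doppler_cut(w, nu):
    """Magnitude of the zero-delay Doppler cut for per-pulse weights w."""
    a = np.exp(2j * np.pi * np.outer(nu, t))        # Doppler steering vectors
    return np.abs(a @ np.conj(w))

nu = np.linspace(-2e3, 2e3, 4001)                  # Doppler grid, Hz
w_matched = np.ones(N) / N                         # matched filter: uniform weights

# Mismatched weights: minimize noise power w^H w plus a penalty on the mean
# response over the sidelobe interval D, subject to unit gain at zero Doppler.
D = nu[(np.abs(nu) > 200) & (np.abs(nu) < 2e3)]    # assumed sidelobe interval
A = np.exp(2j * np.pi * np.outer(D, t))
mu = 50.0                                          # assumed penalty weight
R = np.eye(N) + mu * (A.conj().T @ A) / len(D)
a0 = np.ones(N)                                    # steering vector at nu = 0
w_mis = np.linalg.solve(R, a0)
w_mis /= np.conj(a0) @ w_mis                       # enforce unit gain at nu = 0

def pslr_db(w):
    resp = doppler_cut(w, nu)
    side = resp[np.abs(nu) > 200].max()            # outside the mainlobe region
    return 20 * np.log10(side / resp.max())

# With unit gain, SNR is proportional to 1 / ||w||^2; compare against matched.
snr_loss_db = 10 * np.log10(N * (np.conj(w_mis) @ w_mis).real)

print(f"matched    Doppler PSLR: {pslr_db(w_matched):.1f} dB")
print(f"mismatched Doppler PSLR: {pslr_db(w_mis):.1f} dB")
print(f"SNR loss of mismatched filter: {snr_loss_db:.2f} dB")
```

Raising `mu` trades further SNR loss for deeper sidelobe suppression over D, which mirrors the feasibility question in the abstract: with only N complex weights, a wide interval and a demanding sidelobe level W may not be jointly attainable.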
