Abstract

In a typical radar system, the same hypothesis testing problem is periodically repeated to constantly monitor the scene: this results in a fundamental tradeoff between integration time and scan rate. In this paper, we propose a novel design criterion for radar systems that carefully balances the contrasting requirements of a large probability of detection, a small probability of false alarm, and a short scan time. In particular, our goal is to maximize the detection rate, defined as the average number of detections per unit of time, under a constraint on the false alarm rate, defined as the average number of false alarms per unit of time. Some examples modeling situations commonly encountered in radar applications are presented to illustrate the effects of this design philosophy.
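
To make the stated tradeoff concrete, below is a minimal numerical sketch of the optimization the abstract describes: maximize detections per second subject to a cap on false alarms per second, swept over the dwell (scan) time. The detector model used here (Gaussian matched-filter statistics with coherently integrated SNR growing linearly in dwell time) and every parameter value are illustrative assumptions, not the paper's formulation.

```python
# Illustrative sketch only -- NOT the paper's model. Assumes a Gaussian
# matched-filter detector whose deflection grows as sqrt(dwell time).
import numpy as np
from scipy.stats import norm

snr_per_pulse = 0.5   # hypothetical single-pulse SNR (linear scale)
pri = 1e-3            # hypothetical pulse repetition interval [s]
far_max = 1e-4        # constraint: allowed false alarms per second

def rates(T):
    """Detection rate and false alarm rate for dwell (scan) time T [s]."""
    p_fa = min(far_max * T, 0.5)         # spend the full false-alarm budget
    n_pulses = T / pri                   # pulses coherently integrated
    deflection = np.sqrt(2.0 * n_pulses * snr_per_pulse)
    p_d = norm.sf(norm.isf(p_fa) - deflection)  # Q(Q^{-1}(P_fa) - d)
    return p_d / T, p_fa / T             # detections/s, false alarms/s

dwells = np.logspace(-3, 1, 400)         # candidate scan times [s]
det_rates = np.array([rates(T)[0] for T in dwells])
best = dwells[np.argmax(det_rates)]
print(f"best dwell ~ {best:.4f} s, detection rate ~ {det_rates.max():.2f}/s")
```

Under these assumptions the detection rate has an interior maximum, which is the tradeoff in question: a longer dwell raises the per-scan probability of detection, but fewer scans, and hence fewer detection opportunities, fit into each unit of time.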
