Abstract

We consider optimal decentralized (or, equivalently, quantized) detection under the Neyman–Pearson, Bayes, Ali–Silvey distance, and mutual (Shannon) information criteria. In all cases, it is shown that the optimal sensor decision rules are quantizers that operate on the likelihood ratio of the observations. We further show that randomized fusion rules are suboptimal for the mutual information criterion. We also show that if the processes observed at the sensors are conditionally independent and identically distributed, and the optimization criterion is either a member of a subclass of the Ali–Silvey distances or local mutual information, then the quantizers used at all of the sensors are identical. We give an example showing that this is not generally true for the Neyman–Pearson and Bayes criteria. We examine this last point in some detail and derive necessary conditions under which an assumption of identical sensor quantizer maps is reasonable.
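To make the central structural result concrete, the sketch below implements a likelihood-ratio quantizer for a binary Gaussian shift-in-mean hypothesis pair: each sensor computes the likelihood ratio of its observation and maps it to a discrete message by comparing it against a sorted set of thresholds. The Gaussian model, the means and variance, and the threshold values are illustrative assumptions for this sketch, not parameters taken from the paper; the paper's result is only that the optimal rule has this threshold-on-likelihood-ratio form.

```python
import math

def likelihood_ratio(x, mu0=0.0, mu1=1.0, sigma=1.0):
    """Likelihood ratio L(x) = p1(x) / p0(x) for equal-variance Gaussians.

    For N(mu0, sigma^2) vs. N(mu1, sigma^2) the ratio simplifies to
    exp((mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma^2), monotone in x.
    """
    return math.exp((mu1 - mu0) * (x - 0.5 * (mu0 + mu1)) / sigma ** 2)

def lr_quantizer(x, thresholds):
    """Quantize an observation x into a discrete sensor message.

    thresholds: ascending likelihood-ratio breakpoints (illustrative values;
    optimal breakpoints depend on the chosen criterion). Returns an integer
    level in 0..len(thresholds): the number of thresholds L(x) exceeds.
    """
    lr = likelihood_ratio(x)
    return sum(1 for t in thresholds if lr > t)
```

For example, with thresholds `[0.5, 2.0]` and the default parameters, `L(x) = exp(x - 0.5)`, so an observation near the null mean (`x = -1.0`) maps to message 0 while one near the alternative (`x = 2.0`) maps to message 2. Because the likelihood ratio here is monotone in `x`, this quantizer is equivalent to thresholding the raw observation, but the likelihood-ratio form is the one that remains optimal in general.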
