Abstract

We study a variant of the many-help-one hypothesis testing against independence problem in which the source, not necessarily Gaussian, has finite differential entropy and the observation noises under the null hypothesis are Gaussian. Under the criterion that stipulates maximization of the Type II error exponent subject to a (constant) bound on the Type I error rate, we derive an upper bound on the exponent-rates function. The bound is shown to mirror a corresponding explicit lower bound, except that the lower bound involves the source power (variance) whereas the upper bound involves the source entropy power. Part of the utility of the established bound lies in investigating asymptotic exponent-rates and the losses incurred by distributed detection as a function of the number of observations.
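As a point of reference, two standard definitions behind the abstract's terminology; these are textbook facts, not results from the paper. Under the Stein-type criterion, writing $\beta_n(\epsilon)$ for the smallest Type II error probability achievable by tests whose Type I error probability is at most $\epsilon$, the Type II error exponent is

\[
\theta(\epsilon) \;=\; \liminf_{n \to \infty} \, -\frac{1}{n} \log \beta_n(\epsilon).
\]

The entropy power of a source $X$ with differential entropy $h(X)$ is

\[
N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)},
\]

and it satisfies $N(X) \le \mathrm{Var}(X)$, with equality if and only if $X$ is Gaussian. Consequently, the variance-based lower bound and the entropy-power-based upper bound coincide exactly when the source is Gaussian, which is the sense in which the two bounds mirror each other.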
