Abstract

In this paper, we study the hypothesis testing problem of identifying, among n random variables, the k random variables whose probability distributions differ from those of the remaining (n - k) random variables. Instead of measuring each random variable separately, we propose to use mixed measurements that are functions of multiple random variables. We demonstrate that O(k log(n) / min_{Pi, Pj} C(Pi, Pj)) observations are sufficient to correctly identify the k anomalous random variables with high probability, where C(Pi, Pj) is the Chernoff information between two possible distributions Pi and Pj of the proposed mixed observations. We characterize the Chernoff information under fixed time-invariant mixed observations, random time-varying mixed observations, and deterministic time-varying mixed observations, respectively. For time-varying measurements, we introduce inner and outer conditional Chernoff information in our derivations. We show that mixed observations can strictly improve the error exponent of the hypothesis test over separate observations of the individual random variables. These results imply that mixed observations can reduce the number of samples required in hypothesis testing applications. In contrast to compressed sensing problems, this paper considers random variables whose values change across measurements.
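To make the sample-complexity bound concrete, the following sketch computes the Chernoff information C(P, Q) = -min_{0<λ<1} log Σ_x P(x)^λ Q(x)^(1-λ) for two discrete distributions by a simple grid search over λ, and plugs it into the O(k log(n) / C) scaling from the abstract. The distributions p and q, and the values of n and k, are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def chernoff_information(p, q, grid=1000):
    """Chernoff information between two discrete distributions p and q,
    C(P, Q) = -min over lambda in (0, 1) of log sum_x p(x)^lambda q(x)^(1-lambda),
    approximated by a grid search over lambda."""
    lambdas = np.linspace(1e-6, 1 - 1e-6, grid)
    # log of the Chernoff coefficient for each value of lambda
    log_coeff = np.array([np.log(np.sum(p**l * q**(1 - l))) for l in lambdas])
    return -log_coeff.min()

# Hypothetical pair of distributions for a single (possibly mixed) observation
p = np.array([0.9, 0.1])   # distribution under the anomalous hypothesis
q = np.array([0.5, 0.5])   # distribution under the typical hypothesis
C = chernoff_information(p, q)

# Sample-complexity scaling from the abstract: O(k log(n) / min C(Pi, Pj));
# here the minimum is over the single pair (p, q)
n, k = 1000, 5
scaling = k * np.log(n) / C
```

A larger Chernoff information (more distinguishable distributions) directly shrinks the required number of observations, which is why mixed observations that increase C(Pi, Pj) reduce the sample count.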
