Abstract

The problem of composite binary hypothesis testing of Markov forest (or tree) distributions is considered. The worst-case type-II error exponent is derived under the Neyman-Pearson formulation. Under a simple null hypothesis, the error exponent is derived in closed form and is characterized in terms of the so-called bottleneck edge of the forest distribution. The least favorable distribution for detection is shown to be Markov on the second-best max-weight spanning tree with mutual information edge weights. A necessary and sufficient condition for a positive error exponent is derived.

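As a point of reference for the tree structures the abstract mentions, the following is a minimal sketch (not from the paper) of how a max-weight spanning tree under mutual information edge weights, and the "second-best" such tree, could be computed. The pairwise weight dictionary `mi` and the helper names are hypothetical illustrations; the paper's own characterization of the least favorable distribution is not reproduced here.

```python
# Sketch: max-weight spanning tree with mutual-information edge weights,
# plus the second-best tree found by excluding each tree edge in turn.
# The weights in `mi` are hypothetical values I(X_i; X_j), not from the paper.


def kruskal_max(n, weights, forbidden=frozenset()):
    """Max-weight spanning tree over nodes 0..n-1; returns (edges, total weight)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree, total = [], 0.0
    for e in sorted(weights, key=weights.get, reverse=True):
        if e in forbidden:
            continue
        u, v = e
        ru, rv = find(u), find(v)
        if ru != rv:          # edge joins two components: keep it
            parent[ru] = rv
            tree.append(e)
            total += weights[e]
    return tree, total


def best_and_second_best(n, weights):
    """Best max-weight spanning tree, and the best tree that differs from it."""
    best_tree, best_w = kruskal_max(n, weights)
    # Classic second-best construction: drop one tree edge at a time,
    # recompute, and keep the heaviest alternative.
    second = max(
        (kruskal_max(n, weights, forbidden=frozenset({e})) for e in best_tree),
        key=lambda tw: tw[1],
    )
    return (best_tree, best_w), second


if __name__ == "__main__":
    # Hypothetical mutual-information weights for 4 variables.
    mi = {(0, 1): 0.9, (0, 2): 0.4, (0, 3): 0.1,
          (1, 2): 0.6, (1, 3): 0.3, (2, 3): 0.5}
    best, second = best_and_second_best(4, mi)
    print("max-weight tree:  ", best)
    print("second-best tree: ", second)
```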