Abstract

We study a nonparametric likelihood ratio test (NPLRT) for Gaussian mixtures, based on the nonparametric maximum likelihood estimator (NPMLE) in the context of demixing. The test concerns whether a random sample is drawn from the standard normal distribution. For the alternative hypothesis, we consider mixing distributions with unbounded support. We prove that the divergence rate of the NPLRT under the null is bounded by $\log n$, provided that the support range of the mixing distribution grows no faster than $(\log n/\log 9)^{1/2}$. We further prove that $\sqrt{\log n}$ is a lower bound for the divergence rate when the support range grows at least at the order of $\sqrt{\log n}$. Implications of the upper bound for the divergence rate are discussed.
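To make the test statistic concrete, the following is a minimal numerical sketch, not the authors' implementation: it simulates data under the null $N(0,1)$, approximates the NPMLE of the mixing distribution by EM over a fixed grid of support points (the grid range and size are illustrative choices, standing in for the growing support range in the paper), and forms the likelihood ratio statistic $2(\ell_{\mathrm{NPMLE}} - \ell_{N(0,1)})$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500
x = rng.standard_normal(n)  # sample under the null N(0,1)

# Candidate support points for the mixing distribution; the range [-3, 3]
# is an illustrative choice (the paper lets the support range grow with n).
grid = np.linspace(-3.0, 3.0, 121)
A = norm.pdf(x[:, None] - grid[None, :])  # N(theta_j, 1) densities at each x_i

# EM fixed-point iterations for the NPMLE of the mixing weights
w = np.full(grid.size, 1.0 / grid.size)
for _ in range(2000):
    f = A @ w                              # mixture density at each observation
    w = w * (A / f[:, None]).mean(axis=0)  # self-consistency update

# NPLRT statistic: 2 * (log-likelihood under NPMLE - log-likelihood under N(0,1))
ll_npmle = np.log(A @ w).sum()
ll_null = norm.logpdf(x).sum()
stat = 2.0 * (ll_npmle - ll_null)
print(stat)
```

Since the grid contains $0$, the degenerate mixing distribution at $0$ (i.e. the null density) is feasible, so the statistic is nonnegative up to numerical error; under the null it stays small, consistent with the $\log n$ upper bound on the divergence rate.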
