Abstract

When a normal mixture model is used, the optimal number of components describing the data must be determined. Testing homogeneity is useful for this purpose; however, constructing its theory is challenging, since the test statistic does not converge to a standard limiting distribution even asymptotically. The reason for this asymptotic behavior is that the parameter set describing the null hypothesis (N.H.) contains singularities in the parameter space of the alternative hypothesis (A.H.). Recently, a theory for singular models was developed, and it has elucidated various problems of statistical inference; however, its application to hypothesis testing has been limited. In this article, we tackle the problem of testing homogeneity using Bayesian singular learning theory and derive the asymptotic distributions of the marginal likelihood ratios in three cases: (1) only the mixture ratio is a variable in the A.H.; (2) the mixture ratio and the mean of the mixed distribution are variables; and (3) the mixture ratio, the mean, and the variance of the mixed distribution are variables. These are obtained by applying a scaling technique to the Kullback–Leibler divergence between the N.H. and the A.H. The validity of the hypothesis tests based on these results is confirmed through numerical experiments.
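
To make the test concrete, the following is a minimal sketch of computing a marginal likelihood ratio for case (1), where only the mixture ratio varies under the A.H. It is not the paper's implementation: the fixed component mean b, the uniform prior on the mixing ratio a, and the midpoint-grid approximation of the integral are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

def log_marginal_likelihood_ratio(x, b=1.0, n_grid=1000):
    """Log marginal likelihood ratio for testing homogeneity, case (1):
    N.H.: x_i ~ N(0, 1)
    A.H.: x_i ~ (1 - a) N(0, 1) + a N(b, 1), with a ~ Uniform(0, 1)
    and the component mean b held fixed, so only the mixing ratio a
    is a variable.  The integral over a is approximated on a midpoint
    grid; all computations are done in log space for stability."""
    a = (np.arange(n_grid) + 0.5) / n_grid           # midpoint grid on (0, 1)
    log_f0 = norm.logpdf(x)                           # null component density
    log_f1 = norm.logpdf(x, loc=b)                    # alternative component
    # log[(1 - a) f0(x_i) + a f1(x_i)] for each grid value of a and each x_i
    log_mix = np.logaddexp(np.log1p(-a)[:, None] + log_f0[None, :],
                           np.log(a)[:, None] + log_f1[None, :])
    log_like_alt = log_mix.sum(axis=1)                # log-likelihood at each a
    # average the likelihood over the uniform prior (midpoint rule)
    log_marginal_alt = logsumexp(log_like_alt) - np.log(n_grid)
    return log_marginal_alt - log_f0.sum()            # subtract N.H. log-likelihood

# Data generated under the null hypothesis: the ratio should stay small.
rng = np.random.default_rng(0)
x_null = rng.normal(size=200)
print(log_marginal_likelihood_ratio(x_null))
```

Under the singular-model asymptotics described above, the distribution of this statistic under the N.H. is not the usual chi-square limit, which is precisely what motivates the scaling analysis of the Kullback–Leibler divergence.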
