Abstract

An important task in the analysis of multivariate data is testing the structure of the covariance matrix. In particular, various tests have been proposed for assessing separability. However, the development of a measure of discrepancy between two covariance matrix structures, in relation to studying the power of such tests, remains an open problem. A discrepancy measure is therefore proposed with the property that for two arbitrary alternative hypotheses with the same discrepancy value the power of the tests remains stable, while the power increases as the discrepancy increases. The basic hypothesis concerns the separable covariance structure of the observation matrix under a doubly multivariate normal model, assessed by the likelihood ratio and Rao score tests. It is shown that a particular one-parameter method and the Frobenius norm fail in the power analysis of the tests, whereas the entropy and quadratic loss functions can be used efficiently to measure the discrepancy between separable and non-separable covariance structures for a multivariate normal distribution.
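
As an illustration of the discrepancy measures named in the abstract, the following sketch (not taken from the paper; the matrices Psi, Gamma and the perturbation are arbitrary choices for illustration) computes the entropy loss, the quadratic loss and the Frobenius norm between a separable, Kronecker-product covariance matrix and a non-separable perturbation of it, using the standard definitions tr(Σ Σ0⁻¹) − log det(Σ Σ0⁻¹) − p for the entropy loss and tr[(Σ Σ0⁻¹ − I)²] for the quadratic loss.

```python
# Minimal sketch (assumed setup, not the paper's code): discrepancy measures
# between a separable covariance Sigma0 = Psi ⊗ Gamma and a non-separable
# alternative Sigma1 obtained by a small symmetric perturbation of Sigma0.

import numpy as np

def entropy_loss(sigma, sigma0):
    """Entropy (Stein) loss: tr(Sigma Sigma0^-1) - log det(Sigma Sigma0^-1) - p."""
    p = sigma.shape[0]
    m = sigma @ np.linalg.inv(sigma0)
    return np.trace(m) - np.linalg.slogdet(m)[1] - p

def quadratic_loss(sigma, sigma0):
    """Quadratic loss: tr[(Sigma Sigma0^-1 - I)^2]."""
    p = sigma.shape[0]
    m = sigma @ np.linalg.inv(sigma0) - np.eye(p)
    return np.trace(m @ m)

def frobenius_discrepancy(sigma, sigma0):
    """Frobenius-norm discrepancy: ||Sigma - Sigma0||_F."""
    return np.linalg.norm(sigma - sigma0, ord="fro")

rng = np.random.default_rng(0)

# Separable (null) structure for a doubly multivariate model.
psi = np.array([[1.0, 0.3],
                [0.3, 1.0]])                      # between-component part
gamma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.5, 0.4],
                  [0.0, 0.4, 1.0]])               # within-component part
sigma0 = np.kron(psi, gamma)

# Non-separable alternative: symmetric perturbation, shifted to stay
# positive definite.
eps = 0.2 * rng.standard_normal(sigma0.shape)
sigma1 = sigma0 + (eps + eps.T) / 2.0
min_eig = np.linalg.eigvalsh(sigma1).min()
if min_eig <= 0.0:
    sigma1 += (abs(min_eig) + 0.1) * np.eye(sigma0.shape[0])

print("entropy loss  :", entropy_loss(sigma1, sigma0))
print("quadratic loss:", quadratic_loss(sigma1, sigma0))
print("Frobenius norm:", frobenius_discrepancy(sigma1, sigma0))
```

In a power study along the lines described in the abstract, alternatives would be grouped by the value of such a measure, and the empirical power of the likelihood ratio or Rao score test compared across alternatives with equal discrepancy.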
