Abstract

High dimensionality causes problems in various areas of statistics. A particular situation that has rarely been considered is the testing of hypotheses about multivariate regression models in which the dimension of the multivariate response is large. In this article, a ridge regularization approach is proposed in which either the covariance or the correlation matrix is regularized to ensure nonsingularity irrespective of the dimensionality of the data. It is shown that the proposed approach can be derived through a penalized likelihood argument, which suggests cross-validation of the likelihood function as a natural approach for estimation of the ridge parameter. Useful properties of this likelihood estimator are derived, discussed, and demonstrated by simulation. For a class of test statistics commonly used in multivariate analysis, the proposed regularization approach is compared with some obvious alternative regularization approaches: the generalized inverse and data reduction through principal components analysis. Essentially, the approaches considered differ in how they shrink the eigenvalues of sample covariance and correlation matrices, and this leads to predictable differences in power properties, as demonstrated by simulation. The proposed ridge approach has relatively good power compared with the alternatives considered. In particular, the generalized inverse is shown to perform poorly and cannot be recommended in practice. Finally, the proposed approach is used in an analysis of data on macroinvertebrate biomasses that have been classified to species.
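To make the idea concrete, the sketch below illustrates one common form of ridge regularization of a sample covariance matrix, S + λI, together with selection of λ by cross-validating the Gaussian log-likelihood on held-out data, as the abstract suggests. This is a minimal illustration under assumed conventions (the additive S + λI form, K-fold splitting, and all function names are choices made here, not necessarily the exact estimator developed in the article).

```python
import numpy as np

def ridge_covariance(X, lam):
    """Ridge-regularized sample covariance S + lam * I.

    Nonsingular for any lam > 0, even when the response
    dimension p exceeds the sample size n.
    """
    S = np.cov(X, rowvar=False)
    return S + lam * np.eye(S.shape[0])

def cv_loglik(X, lam, k=5, seed=0):
    """Average held-out Gaussian log-likelihood under the
    ridge-regularized covariance; maximize over lam to pick
    the ridge parameter (an assumed, simplified criterion)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    idx = rng.permutation(n)
    total = 0.0
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        mu = X[train].mean(axis=0)
        S = ridge_covariance(X[train], lam)
        _, logdet = np.linalg.slogdet(S)
        Sinv = np.linalg.inv(S)
        D = X[fold] - mu
        quad = np.einsum('ij,jk,ik->i', D, Sinv, D)  # Mahalanobis terms
        total += np.sum(-0.5 * (p * np.log(2 * np.pi) + logdet + quad))
    return total / n

# Usage: evaluate cv_loglik over a grid of lam values and keep the maximizer.
```

Because S + λI simply adds λ to every eigenvalue of S, this is one explicit way of "shrinking" the eigenvalue spectrum, which is the dimension along which the abstract contrasts the ridge, generalized-inverse, and principal-components approaches.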
