Abstract

In this paper, we consider multivariate response regression models with high-dimensional predictor variables. One way to estimate the coefficient matrix is through its decomposition. Among the various decompositions of the coefficient matrix, we focus on the one that gives the best approximation to the signal part of the response vector at any given rank. Finding this decomposition is equivalent to performing a principal component analysis of the signal. For any given rank, this decomposition attains nearly the smallest expected prediction error among all decompositions of the coefficient matrix with the same rank. To estimate the decomposition, we solve a penalized generalized eigenvalue problem followed by a least squares procedure. In the high-dimensional setting, allowing a general covariance structure for the noise vector, we establish oracle inequalities for the estimates. Simulation studies and an application to real data show that the proposed method has good prediction performance and is efficient in dimension reduction for various models.
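
To fix ideas, here is a minimal sketch of the setup in standard (assumed) notation; the abstract itself does not define the symbols $Y$, $X$, $C$, $E$, or the rank $r$. With response matrix $Y \in \mathbb{R}^{n \times q}$, predictor matrix $X \in \mathbb{R}^{n \times p}$, coefficient matrix $C \in \mathbb{R}^{p \times q}$, and noise matrix $E$, the model reads
\[
Y = XC + E,
\]
and the target decomposition at rank $r$ is the one whose rank-$r$ component best approximates the signal $XC$,
\[
C_r \in \operatorname*{arg\,min}_{\operatorname{rank}(B) \le r} \; \lVert XC - XB \rVert_F^2,
\]
which amounts to projecting the signal onto its top $r$ principal components, i.e., a principal component analysis of $XC$, as stated in the abstract.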
