Abstract
This manuscript presents an approach to performing generalized linear regression with multiple high dimensional covariance matrices as the outcome. In many areas of study, such as resting-state functional magnetic resonance imaging (fMRI), this type of regression can be used to characterize variation in the covariance matrices across units. Model parameters are estimated by maximizing a likelihood formulation of a generalized linear model, conditioning on a well-conditioned linear shrinkage estimator for multiple covariance matrices in which the shrinkage coefficients are shared across matrices. Theoretical studies demonstrate that the proposed covariance matrix estimator is optimal, asymptotically achieving the uniformly minimum quadratic loss among all linear combinations of the identity matrix and the sample covariance matrix. Under certain regularity conditions, the proposed estimator of the model parameters is consistent. Simulation studies illustrate the superior performance of the proposed approach over existing methods. Applied to a resting-state fMRI study from the Alzheimer's Disease Neuroimaging Initiative, the proposed approach identified a brain network within which functional connectivity is significantly associated with Apolipoprotein E ε4, a strong genetic marker for Alzheimer's disease.
Highlights
In this manuscript, we study a regression problem with covariance matrices as the outcome under a high dimensional setting.
The CSCAP approach achieves much lower mean squared error (MSE) in estimating the eigenvalues, and lower MSE and higher coverage probability (CP) in estimating the β coefficients.
We introduce an approach to perform linear regression with multiple high dimensional covariance matrices as the outcome.
Summary
We study a regression problem with covariance matrices as the outcome under a high dimensional setting. This paper first studies a joint shrinkage estimator for multiple high dimensional covariance matrices, generalizing the linear shrinkage estimator for a single covariance matrix [24]. We show that the single-matrix approach is suboptimal compared to the proposed joint covariance shrinkage estimator, in which the shrinkage coefficients are shared across the matrices. Within this class of shrinkage estimators, we believe this is among the first attempts to analyze the variation of a large number of covariance matrices associated with covariates in a regression setting under certain model assumptions.
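To illustrate the idea of a linear shrinkage estimator of the form ρ₁I + ρ₂S with coefficients shared across matrices, the sketch below follows the standard single-matrix construction of [24] (Ledoit–Wolf) and pools its plug-in quantities across samples. The pooling rule and all function names here are illustrative assumptions for exposition, not the paper's exact estimator:

```python
import numpy as np

def lw_quantities(X):
    """Plug-in quantities of the Ledoit-Wolf linear shrinkage estimator
    for one sample X of shape (n, p); the mean is assumed to be zero."""
    n, p = X.shape
    S = X.T @ X / n                      # sample covariance matrix
    mu = np.trace(S) / p                 # scale of the identity target
    d2 = np.linalg.norm(S - mu * np.eye(p), "fro") ** 2 / p
    b2_bar = sum(np.linalg.norm(np.outer(x, x) - S, "fro") ** 2
                 for x in X) / (n ** 2 * p)
    b2 = min(b2_bar, d2)                 # estimated sampling error, capped
    return mu, d2, b2, S

def joint_shrinkage(samples):
    """Shrink several sample covariance matrices toward scaled identities
    using a single pair of shrinkage coefficients shared across matrices
    (coefficients obtained here by averaging the per-sample quantities)."""
    stats = [lw_quantities(X) for X in samples]
    mu = np.mean([s[0] for s in stats])
    d2 = np.mean([s[1] for s in stats])
    b2 = np.mean([s[2] for s in stats])
    rho1 = (b2 / d2) * mu                # weight on the identity target
    rho2 = (d2 - b2) / d2                # weight on each sample covariance
    p = samples[0].shape[1]
    return [rho1 * np.eye(p) + rho2 * S for (_, _, _, S) in stats]
```

Because ρ₁ > 0 and ρ₂ ∈ [0, 1], each resulting estimate is symmetric and positive definite even when the dimension p exceeds the sample size n, which is what makes the estimator well-conditioned in the high dimensional setting.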