Abstract

Modeling the covariance matrix of multivariate longitudinal data is more challenging than its univariate counterpart due to the correlations among multiple responses. The modified Cholesky block decomposition reduces the task of covariance modeling to parsimonious modeling of its two matrix factors: the regression coefficient matrices and the innovation covariance matrices. These parameters are statistically interpretable; however, ensuring positive-definiteness of several (innovation) covariance matrices presents a new challenge. We address this problem using a subclass of Anderson's (1973) linear covariance models, modeling each covariance matrix as a linear combination of known positive-definite basis matrices with unknown non-negative scalar coefficients. A novelty of this approach is that positive-definiteness is guaranteed by construction; it removes a drawback of Anderson's model and hence makes linear covariance models more realistic and viable in practice. Maximum likelihood estimates are computed using a simple iterative majorization-minimization algorithm. The estimators are shown to be consistent and asymptotically normal. A simulation study and a data example illustrate the applicability of the proposed method in providing good models for the covariance structure of multivariate longitudinal data.
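The positive-definiteness-by-construction property described above can be illustrated with a minimal sketch (an assumed illustration, not the authors' code): a non-negative linear combination of known positive-definite basis matrices, with at least one strictly positive coefficient, is itself positive-definite. The basis matrices below (an identity matrix and an AR(1)-type correlation matrix) are hypothetical choices for illustration only.

```python
import numpy as np

def linear_covariance(coeffs, bases):
    """Sigma = sum_i alpha_i * B_i, with alpha_i >= 0 and each B_i
    positive-definite; the result is positive-definite by construction
    provided at least one alpha_i is strictly positive."""
    coeffs = np.asarray(coeffs, dtype=float)
    if np.any(coeffs < 0) or not np.any(coeffs > 0):
        raise ValueError("coefficients must be non-negative, not all zero")
    return sum(a * B for a, B in zip(coeffs, bases))

# Hypothetical basis matrices: identity and an AR(1)-type correlation matrix.
d = 4
B1 = np.eye(d)
rho = 0.5
B2 = rho ** np.abs(np.subtract.outer(np.arange(d), np.arange(d)))

Sigma = linear_covariance([0.8, 1.3], [B1, B2])
# All eigenvalues are positive, confirming positive-definiteness.
print(np.all(np.linalg.eigvalsh(Sigma) > 0))
```

No constrained optimization is needed to keep the estimate in the positive-definite cone; only the scalar coefficients must be kept non-negative, which is the simplification the abstract highlights.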
