Abstract

The smoothness of the subdiagonals of the Cholesky factor of large covariance matrices is closely related to the degree of nonstationarity of autoregressive models for time series data. Heuristically, for a nearly stationary covariance matrix one expects the entries in each subdiagonal of the Cholesky factor of its inverse to be approximately equal, in the sense that the sum of the absolute values of successive differences is small. Statistically, such smoothness is achieved by regularizing each subdiagonal using fused-type lasso penalties. We rely on the standard Cholesky factor as the new parameter within a regularized normal likelihood setup, which guarantees: (a) joint convexity of the likelihood function, (b) strict convexity of the likelihood function restricted to each subdiagonal even when n < p, and (c) positive definiteness of the estimated covariance matrix. A block coordinate descent algorithm, in which each block is a subdiagonal, is proposed, and its convergence is established under mild conditions. Because the penalized likelihood function does not decouple into a sum of functions of individual subdiagonals, the method presents some computational challenges and advantages relative to two recent algorithms for sparse estimation of the Cholesky factor, which decouple row-wise. Simulation results and a real data analysis demonstrate the scope and good performance of the proposed methodology. Software for our method is freely available in the R language. Supplementary materials for this article are available online.
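
To make the setup above concrete, the following is a minimal sketch of one possible form of the penalized objective; the notation (sample covariance $S$, lower-triangular Cholesky factor $L$ of the inverse covariance matrix, and per-subdiagonal tuning parameters $\lambda_k$) is assumed for illustration and need not match the authors' exact formulation.

$$
\min_{\substack{L \ \text{lower triangular} \\ L_{jj} > 0}}
\; -2\sum_{j=1}^{p} \log L_{jj}
\;+\; \operatorname{tr}\!\bigl(S\,L L^{\top}\bigr)
\;+\; \sum_{k=1}^{p-1} \lambda_k \sum_{i=1}^{p-k-1} \bigl| L_{i+k,\,i} - L_{i+k+1,\,i+1} \bigr|
$$

Here $LL^{\top}$ plays the role of the inverse covariance matrix, the first two terms form the Gaussian negative log-likelihood up to constants, and the fused penalty on the $k$-th subdiagonal encourages its successive entries to be nearly equal, in keeping with the stationarity heuristic. A block coordinate descent step would then fix all but one subdiagonal and solve the resulting convex, fused-lasso-type subproblem in that subdiagonal.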
