Abstract

We propose a data-driven, model-free way to reduce the noise of covariance matrices of time-varying systems. If the true covariance matrix is time-invariant, non-linear shrinkage of the eigenvalues is known to yield the optimal estimator for large matrices. Such a method outputs eigenvalues that are highly dependent on the inputs, as common sense suggests. When the covariance matrix is time-dependent, we show that it is generally better to use the set of eigenvalues that encode the average influence of the future on present eigenvalues, resulting in a set of time-independent average eigenvalues. This situation is widespread in nature, one example being financial markets, where non-linear shrinkage remains the gold-standard filtering method. Our approach outperforms non-linear shrinkage both for the Frobenius norm distance, which is the typical loss function used for covariance filtering, and for financial portfolio variance minimization, which makes our method generically relevant to many problems of multivariate inference. Further analysis of financial data suggests that the expected overlap between past eigenvectors and future ones is systematically overestimated by methods designed for constant covariance matrices. Our method takes a simple empirical average of the eigenvector overlap matrix, which is enough to outperform non-linear shrinkage.
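The core quantity described above can be illustrated with a short sketch. For each eigenvector v_i of a past sample covariance, the oracle cleaned eigenvalue is v_i^T E_future v_i, and averaging these projections over successive windows yields a time-independent set of average eigenvalues. This is a minimal toy illustration, not the paper's implementation: the window count, dimensions, and the random-walk perturbation of the true covariance are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 30, 120  # assumed: N variables, T observations per window

def sample_cov(X):
    """Sample covariance of a (T, N) data matrix."""
    return np.cov(X, rowvar=False)

# Simulate consecutive windows with a slowly varying true covariance
# (toy assumption: small random positive-semidefinite perturbations).
windows = []
C = np.eye(N)
for _ in range(6):
    A = 0.05 * rng.standard_normal((N, N))
    C = C + A @ A.T
    X = rng.multivariate_normal(np.zeros(N), C, size=T)
    windows.append(X)

# For each pair of consecutive windows, project the "future" sample
# covariance onto the eigenvectors of the "past" one:
# lambda_i = v_i^T E_future v_i.
per_window_eigs = []
for past, future in zip(windows[:-1], windows[1:]):
    E_past, E_future = sample_cov(past), sample_cov(future)
    vals, vecs = np.linalg.eigh(E_past)  # eigenvalues in ascending order
    oracle = np.einsum("ji,jk,ki->i", vecs, E_future, vecs)
    per_window_eigs.append(oracle)

# Time-independent average eigenvalues, in the spirit of the abstract.
avg_eigs = np.mean(per_window_eigs, axis=0)
print(avg_eigs.shape)
```

The averaged eigenvalues can then be recombined with the current window's eigenvectors to build a filtered covariance estimate; the paper's claim is that this averaging outperforms non-linear shrinkage when the true covariance drifts in time.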
