Abstract

The optimal solutions of many decision problems, such as Markowitz portfolio allocation and linear discriminant analysis, depend on the inverse covariance matrix of a Gaussian random vector. In “Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator,” Nguyen, Kuhn, and Mohajerin Esfahani propose a distributionally robust inverse covariance estimator, obtained by robustifying the Gaussian maximum likelihood problem with a Wasserstein ambiguity set. In the absence of any prior structural information, the estimation problem has an analytical solution that is naturally interpreted as a nonlinear shrinkage estimator. Besides being invertible and well conditioned, the new shrinkage estimator is rotation equivariant and preserves the order of the eigenvalues of the sample covariance matrix. If there are sparsity constraints, which are typically encountered in Gaussian graphical models, the estimation problem can be solved using a sequential quadratic approximation algorithm.
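To illustrate the rotation-equivariant structure mentioned in the abstract, the sketch below eigendecomposes the sample covariance, pushes each eigenvalue through a monotone shrinkage map, and reassembles a well-conditioned precision matrix with the same eigenvectors. The function `rotation_equivariant_precision` and the map `example_shrink` are hypothetical placeholders chosen for illustration; they are not the paper's closed-form solution, which is parameterized by the Wasserstein radius.

```python
import numpy as np

def rotation_equivariant_precision(sample_cov, shrink):
    """Sketch of a rotation-equivariant precision (inverse covariance) estimator.

    Keeps the eigenvectors of the sample covariance and maps each eigenvalue
    through a monotone function `shrink`, so the eigenvalue ordering of the
    sample covariance is respected and the result is invertible. The exact
    shrinkage map derived in the paper is not reproduced here.
    """
    eigvals, eigvecs = np.linalg.eigh(sample_cov)   # eigenvalues in ascending order
    prec_eigvals = shrink(eigvals)                  # elementwise, monotone map
    return (eigvecs * prec_eigvals) @ eigvecs.T     # reassemble U diag(x) U^T

def example_shrink(lam, eps=1e-1):
    # Illustrative stand-in only: invert a regularized eigenvalue, which keeps
    # the estimator well conditioned even when sample eigenvalues are zero
    # (n < p). This is NOT the Wasserstein shrinkage formula from the paper.
    return 1.0 / (lam + eps)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 100))        # n = 50 samples, p = 100 dimensions
S = X.T @ X / X.shape[0]                  # singular sample covariance (rank <= 50)
Omega = rotation_equivariant_precision(S, example_shrink)
print(np.linalg.cond(Omega))              # finite condition number despite singular S
```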
