We propose a robust optimization (RO) framework that immunizes some of the central problems of linear algebra against data uncertainty. Specifically, we formulate linear systems, matrix inversion, eigenvalue–eigenvector computation, and matrix factorization under uncertainty as robust optimization problems, using appropriate descriptions of the uncertainty. The resulting optimization problems are computationally tractable and scalable. We show theoretically that RO improves the relative error of solving a linear system by reducing the condition number of the underlying matrix. Moreover, we provide empirical evidence that the proposed approach outperforms state-of-the-art methods for linear systems and matrix inversion when applied to ill-conditioned matrices. We show that computing eigenvalues and eigenvectors under RO corresponds to solving linear systems that are better conditioned than the nominal ones, and we illustrate with numerical experiments that the proposed approach is more accurate than the nominal one when ill-conditioned matrices are perturbed. Finally, we demonstrate empirically the benefit of the robust Cholesky factorization over the nominal one.
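The abstract's claim that reducing the condition number improves relative error rests on the classical perturbation bound for linear systems, ‖Δx‖/‖x‖ ≤ κ(A)·‖Δb‖/‖b‖. The sketch below (illustrative only; it uses a Hilbert matrix as a standard ill-conditioned example and is not the paper's RO method) shows how a tiny perturbation of the data can be amplified by the condition number:

```python
import numpy as np

# Illustrative sketch of the classical perturbation bound
#   ||dx|| / ||x||  <=  cond(A) * ||db|| / ||b||,
# which is the error amplification the RO framework targets.
# The Hilbert matrix is a standard ill-conditioned test case.

rng = np.random.default_rng(0)

n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true

# Tiny perturbation of the right-hand side (uncertain data).
db = 1e-10 * rng.standard_normal(n)
x_pert = np.linalg.solve(A, b + db)

rel_err_b = np.linalg.norm(db) / np.linalg.norm(b)
rel_err_x = np.linalg.norm(x_pert - x_true) / np.linalg.norm(x_true)

print(f"cond(A)       = {np.linalg.cond(A):.2e}")
print(f"rel. err in b = {rel_err_b:.2e}")
print(f"rel. err in x = {rel_err_x:.2e}")
```

Because κ(A) is on the order of 1e10 here, a perturbation of relative size ~1e-10 in b can produce an O(1) relative error in x; a better-conditioned matrix tightens this bound, which is the mechanism behind the improved relative error reported above.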