Abstract

We provide a computational framework for approximating a class of structured matrices; here, the term structure is very general and may refer to a regular sparsity pattern (e.g., block-banded) or to something more highly structured (e.g., symmetric block Toeplitz). The goal is to uncover additional latent structure that in turn leads to computationally efficient algorithms when the new structured matrix approximations are employed in place of the original operator. Our approach has three steps: map the structured matrix to tensors, apply tensor compression algorithms, and map the compressed tensors back to obtain two different matrix representations, a sum of Kronecker products and a block low-rank format. The use of tensor decompositions enables us to uncover latent structure in the problem and leads to compressed representations of the original matrix that can be used efficiently in applications. The resulting matrix approximations are memory efficient, easy to compute with, and preserve, in the Frobenius norm, the error due to the tensor compression. Our framework is quite general. We illustrate its ability to uncover block low-rank structure in matrices from two applications: system identification and space-time covariance matrices. In addition, we demonstrate that our approach can uncover sums of structured Kronecker products in several matrices from the SuiteSparse collection. Finally, we show that our framework is broad enough to encompass and improve on other related results from the literature, as we illustrate with the approximation of a three-dimensional blurring operator.
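
As a minimal illustration of the rearrange-compress-map-back pipeline described above (not the paper's full algorithm, which uses higher-order tensor decompositions), the Python sketch below computes a sum-of-Kronecker-products approximation via the classical Van Loan-Pitsianis construction: the matrix is reshaped so that each block becomes a row of a rearranged matrix, that matrix is compressed with a truncated SVD, and the singular vectors are mapped back to Kronecker factors. The function name nkp_sum and the block dimensions are illustrative choices, not from the paper; the Frobenius norm of the resulting matrix error equals the SVD truncation error, mirroring the error-preservation property stated in the abstract.

    import numpy as np

    def nkp_sum(A, m1, n1, m2, n2, r):
        """Rank-r sum-of-Kronecker-products approximation of A.

        A is (m1*m2) x (n1*n2). Each m2 x n2 block of A becomes one row
        of the rearranged matrix R; a truncated SVD of R then gives
        A ~= sum_k kron(B_k, C_k), and the Frobenius error of the matrix
        approximation equals the SVD truncation error.
        """
        # Rearrange: row (i*n1 + j) of R holds the flattened (i, j) block of A.
        R = (A.reshape(m1, m2, n1, n2)
              .transpose(0, 2, 1, 3)
              .reshape(m1 * n1, m2 * n2))
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        # Split each singular value as sqrt(s)*sqrt(s) to balance the
        # scales of the two Kronecker factors.
        Bs = [np.sqrt(s[k]) * U[:, k].reshape(m1, n1) for k in range(r)]
        Cs = [np.sqrt(s[k]) * Vt[k, :].reshape(m2, n2) for k in range(r)]
        return Bs, Cs

    # Usage: recover an exact two-term Kronecker structure.
    rng = np.random.default_rng(0)
    A = sum(np.kron(rng.standard_normal((4, 4)), rng.standard_normal((3, 3)))
            for _ in range(2))
    Bs, Cs = nkp_sum(A, 4, 4, 3, 3, r=2)
    A_hat = sum(np.kron(B, C) for B, C in zip(Bs, Cs))
    print(np.linalg.norm(A - A_hat))  # ~1e-14: exact up to rounding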
