Abstract

Error-contaminated linear approximation problems arise in a wide variety of applications. The presence of redundant or irrelevant data complicates their solution. It has been shown that such data can be removed by core reduction, yielding a subproblem of minimal dimensions called the core problem. Direct (SVD- or Tucker decomposition-based) reduction has been introduced previously for problems with matrix models and vector, matrix, or tensor observations, and also for problems with bilinear models. For the cases of vector and matrix observations, a Krylov subspace method, the generalized Golub–Kahan bidiagonalization, can be used to extract the core problem. In this paper, we first unify the previously studied variants of linear approximation problems under the general framework of a multilinear approximation problem and show how direct core reduction extends to it. We then show that the generalized Golub–Kahan bidiagonalization yields the core problem for any multilinear approximation problem. This further allows us to prove various properties of core problems; in particular, we give upper bounds on the multiplicity of singular values of the reduced matrices.
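To illustrate the kind of Krylov subspace method the abstract refers to, the following is a minimal NumPy sketch of the classical Golub–Kahan bidiagonalization for a matrix model A and a single vector observation b. The function name and interface are illustrative, not from the paper, and the generalized variant used there for matrix and tensor observations is not shown; the sketch also assumes no breakdown (no zero normalization coefficients) occurs.

```python
import numpy as np

def golub_kahan(A, b, k):
    """Run k steps of Golub-Kahan bidiagonalization on (A, b).

    Returns orthonormal bases U (m x (k+1)), V (n x k) and the
    bidiagonal coefficients alphas (diagonal), betas (subdiagonal),
    satisfying A @ V = U @ B with B lower bidiagonal of size (k+1, k).
    Illustrative sketch only; assumes no breakdown occurs.
    """
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    alphas, betas = [], []
    # Initialize: beta_1 u_1 = b, then alpha_1 v_1 = A^T u_1.
    beta = np.linalg.norm(b)
    U[:, 0] = b / beta
    v = A.T @ U[:, 0]
    for j in range(k):
        alpha = np.linalg.norm(v)
        V[:, j] = v / alpha
        alphas.append(alpha)
        # beta_{j+1} u_{j+1} = A v_j - alpha_j u_j
        u = A @ V[:, j] - alpha * U[:, j]
        beta = np.linalg.norm(u)
        U[:, j + 1] = u / beta
        betas.append(beta)
        # alpha_{j+1} v_{j+1} = A^T u_{j+1} - beta_{j+1} v_j
        v = A.T @ U[:, j + 1] - beta * V[:, j]
    return U, V, np.array(alphas), np.array(betas)
```

In exact arithmetic the process deflates the redundant and irrelevant part of the data: it stops (a normalization coefficient becomes zero) exactly when the minimally dimensioned core problem has been extracted, which is the property the paper generalizes to the multilinear setting.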
