Abstract

Total least squares (TLS) is a popular data fitting approach for solving linear approximation problems Ax ≈ b (i.e., with a vector right-hand side) and AX ≈ B (i.e., with a matrix right-hand side) contaminated by errors. This paper introduces a generalization of the TLS formulation to problems with structured right-hand sides. First, we focus on the case where the right-hand side, and consequently also the solution, is a tensor. We show that whereas the basic solvability result can be obtained directly by matricization of both tensors, generalizing the core problem reduction is more complicated. The core reduction makes it possible to reduce the problem dimensions mathematically by removing all redundant and irrelevant data from the system matrix and the right-hand side. We prove that the core problems within the original tensor problem and its matricized counterpart are in general different. We then turn to problems with even more structured right-hand sides, where the same model A corresponds to a set of various tensor right-hand sides. Finally, relations between the matrix and tensor core problems are discussed.
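
As a concrete illustration (not taken from the paper itself), the classical TLS solution of AX ≈ B can be computed from the SVD of the augmented matrix [A B], and a tensor right-hand side can be handled by the matricization step the abstract refers to. The sketch below assumes the generic case in which the relevant block of right singular vectors is nonsingular; the function name `tls` and all data shapes are illustrative.

```python
import numpy as np

def tls(A, B):
    """Classical TLS solution of A X ~ B via the SVD of [A, B]
    (Golub & Van Loan). Assumes the generic case where the
    bottom-right block V22 of right singular vectors is nonsingular."""
    m, n = A.shape
    # SVD of the augmented matrix [A, B]
    _, _, Vt = np.linalg.svd(np.hstack([A, B]))
    V = Vt.T
    V12 = V[:n, n:]   # top-right block, n x d
    V22 = V[n:, n:]   # bottom-right block, d x d
    # X = -V12 @ inv(V22), computed via a linear solve
    return -np.linalg.solve(V22.T, V12.T).T

# Tensor right-hand side: unfold (matricize) along the first mode,
# solve the matrix TLS problem, and fold the solution back.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 4))
B_tensor = rng.standard_normal((20, 3, 2))  # hypothetical data
B_mat = B_tensor.reshape(20, -1)            # mode-1 unfolding
X_mat = tls(A, B_mat)                       # 4 x 6 matrix solution
X_tensor = X_mat.reshape(4, 3, 2)           # fold back to a tensor
```

As the abstract emphasizes, this matricized route yields the solvability result, but the core problem extracted from the matricized system is in general not the same as the core problem of the original tensor formulation.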
