Abstract

Tensor representations allow compact storage and efficient manipulation of multi-dimensional data. Building on these representations, tensor methods construct low-rank subspaces for the solution of multi-dimensional and multi-parametric models. However, tensor methods cannot always be implemented efficiently, especially when dealing with non-linear models. In this paper, we discuss the importance of achieving a tensor representation of the model itself for the efficiency of tensor-based algorithms. We investigate the adequacy of interpolation rather than projection-based approaches as a means to enforce such a tensor representation, and propose the use of cross approximations for models of moderate dimension. Finally, the linearization of tensor problems is analyzed, and several strategies for constructing the tensor subspace are proposed.

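The cross approximations discussed in the paper are formulated for tensors; as a loose illustration of the underlying idea only, the sketch below builds a low-rank skeleton (cross) approximation of a matrix using full pivoting on the residual. It assumes only NumPy; the function name cross_approximation, the chosen rank, and the test matrix are illustrative assumptions, not taken from the paper.

import numpy as np

def cross_approximation(A, rank):
    """Greedy cross (skeleton) approximation A ~ U @ V of the given rank.

    At each step, the largest-magnitude entry of the current residual is
    taken as a pivot; the corresponding row and column form a rank-one update.
    """
    R = A.astype(float)          # residual matrix (copy of A)
    us, vs = [], []
    for _ in range(rank):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        pivot = R[i, j]
        if abs(pivot) < 1e-14:   # residual is numerically zero: stop early
            break
        u = R[:, j].copy()
        v = R[i, :] / pivot
        R -= np.outer(u, v)      # remove the captured rank-one contribution
        us.append(u)
        vs.append(v)
    return np.column_stack(us), np.vstack(vs)

# Usage: a smooth bivariate function sampled on a grid is nearly low-rank.
x = np.linspace(0.0, 1.0, 200)
A = 1.0 / (1.0 + np.add.outer(x, x))
U, V = cross_approximation(A, rank=8)
print(np.linalg.norm(A - U @ V) / np.linalg.norm(A))

The same pivot-and-update idea carries over to tensor formats, where only a small number of fibers of the tensor need to be evaluated, which is what makes cross approximations attractive for non-linear models in moderate dimension.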