Abstract
Parametric models in vector spaces are shown to possess an associated linear map, leading directly to reproducing kernel Hilbert spaces and affine/linear representations in terms of tensor products. From this map, analogues of correlation operators can be formed such that the associated linear map factorises the correlation. Its spectral decomposition and the associated Karhunen-Loève or proper orthogonal decomposition in a tensor product follow directly, including an extension to continuous spectra. It is shown that all factorisations of a certain class are unitarily equivalent, and that every factorisation induces a different representation, and vice versa. No particular assumptions are made on the parameter set, other than that the vector space of real-valued functions on this set allows an appropriate inner product on a subspace. A completely equivalent spectral and factorisation analysis can be carried out in kernel space. The relevance of these abstract constructions is shown on a number of mostly familiar examples, thus unifying many such constructions under one theoretical umbrella. From the factorisation one obtains tensor representations, which may be cascaded, leading to tensors of higher degree. When carried over to a discretised level in the form of a model order reduction, such factorisations allow sparse low-rank approximations, which lead to very efficient computations, especially in high dimensions.
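To illustrate the final point of the abstract on a discretised level, the sketch below computes a proper orthogonal (Karhunen-Loève) decomposition of a snapshot matrix via the singular value decomposition and truncates it to a low-rank approximation. The parametric model, array names, and sample sizes are purely illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the paper's code): discretised proper orthogonal
# decomposition via SVD. Snapshots of a hypothetical parametric model
# r(p) are collected column-wise; the model and names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parametric model: state vectors of dimension n evaluated
# at m parameter samples, stacked as columns of the snapshot matrix.
n, m = 200, 50
params = rng.uniform(0.0, 1.0, size=m)
snapshots = np.column_stack(
    [np.sin(np.linspace(0, 2 * np.pi, n) * (1 + p)) for p in params]
)

# The thin SVD factorises the snapshot matrix: the columns of U are the
# POD / Karhunen-Loève modes, and s**2 are the eigenvalues of the
# discrete correlation matrix.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

# Low-rank tensor-product representation keeping the k dominant modes.
k = 5
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
rel_err = np.linalg.norm(snapshots - approx) / np.linalg.norm(snapshots)
print(f"relative error with {k} modes: {rel_err:.2e}")
```

The truncated factors play the role of a reduced-order model: the left factor carries the spatial modes and the right factor the parameter dependence, which is the kind of sparse low-rank tensor representation the abstract refers to.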