Abstract
We introduce a supervised learning framework for target functions that are well approximated by a sum of (few) separable terms. The framework approximates each component function by a B-spline, resulting in an approximant whose underlying coefficient tensor of the tensor product expansion has a low-rank polyadic decomposition parametrization. By exploiting this multilinear structure, as well as the sparsity pattern of the compactly supported B-spline basis terms, we demonstrate that such an approximant is well suited for regression and classification tasks when the parameters are trained with the Gauss–Newton algorithm. Various numerical examples illustrate the effectiveness of the approach.
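To make the parametrization concrete, below is a minimal sketch (not the authors' implementation) of how such an approximant can be evaluated: the coefficient tensor is never formed explicitly; instead, each factor matrix of the polyadic decomposition is contracted with the corresponding univariate B-spline design matrix. The function and variable names (evaluate_cpd_bspline, factors, knots) are illustrative, and the sketch assumes SciPy's BSpline.design_matrix is available (SciPy >= 1.8).

```python
# Hedged sketch of evaluating a B-spline approximant whose coefficient tensor
# has a rank-R polyadic (CPD) parametrization. Names and setup are illustrative.
import numpy as np
from scipy.interpolate import BSpline

def evaluate_cpd_bspline(X, knots, degree, factors):
    """Evaluate f(x) = sum_r prod_d sum_i factors[d][i, r] * B_i^{(d)}(x_d).

    X       : (N, D) array of evaluation points
    knots   : list of D knot vectors
    degree  : B-spline degree k (same in every dimension, for simplicity)
    factors : list of D factor matrices; factors[d] has shape (len(knots[d]) - k - 1, R)
    """
    N, D = X.shape
    R = factors[0].shape[1]
    Z = np.ones((N, R))
    for d in range(D):
        # Sparse (N x I_d) design matrix: only degree + 1 basis functions are
        # nonzero per sample, the kind of sparsity the abstract refers to.
        B_d = BSpline.design_matrix(X[:, d], knots[d], degree)
        Z *= np.asarray(B_d @ factors[d])   # (N, R): one column per rank-1 term
    return Z.sum(axis=1)                    # sum over the R separable terms

# Toy usage with random factors (illustration only)
rng = np.random.default_rng(0)
D, R, k = 3, 2, 3
knots = [np.r_[[0.0] * k, np.linspace(0, 1, 8), [1.0] * k] for _ in range(D)]
factors = [rng.standard_normal((len(t) - k - 1, R)) for t in knots]
X = rng.uniform(0, 1, size=(100, D))
y_hat = evaluate_cpd_bspline(X, knots, k, factors)
```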
Highlights
Approximating multivariate functions in high dimensions quickly becomes infeasible due to the curse of dimensionality
Building on the results for the GN-based computation of a CPD using alternative cost functions (Vandecappelle et al., 2021), we show that our algorithm can be adapted to accommodate logistic cost functions, which are better suited to classification problems (see the sketch after these highlights)
We have introduced a supervised learning framework for regression and classification tasks which aims to approximate target functions with a sum of separable terms
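As a rough illustration of the logistic cost mentioned in the second highlight, the sketch below maps the real-valued model output z = f(x) through a sigmoid and scores it with binary cross-entropy. This is an assumption about the general form of such a cost, not the authors' exact formulation; the name logistic_cost is illustrative.

```python
# Hedged sketch of a logistic (binary cross-entropy) cost on model outputs z = f(x).
import numpy as np

def logistic_cost(z, y):
    """Binary cross-entropy of labels y in {0, 1} given real-valued outputs z."""
    p = 1.0 / (1.0 + np.exp(-z))   # sigmoid link
    eps = 1e-12                    # guard against log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
```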
Summary
Approximating multivariate functions in high dimensions quickly becomes infeasible due to the curse of dimensionality. A common architecture, adopted by deep neural networks (Schmidhuber, 2015), is to express the approximant as a sequence of compositions of simpler functions. We study another commonly occurring structure in which the target function f(x) essentially has low rank and can be expressed as a sum of few separable terms, i.e., $f(\mathbf{x}) \approx \sum_{r=1}^{R} \prod_{d=1}^{D} f_r^{(d)}(x_d)$; for instance, $f(x_1, x_2) = \sin(x_1)\, e^{x_2} + x_1 \log(1 + x_2)$ is a sum of $R = 2$ separable terms in $D = 2$ variables. Apart from prior work on sums of separable functions (Beylkin et al., 2009; Garcke, 2010; Kargas and Sidiropoulos, 2021), the utility of other types of tensor decompositions has also been studied in the literature.