Abstract
One of the most challenging tasks in computational science is the approximation of high-dimensional functions. Most of the time, only limited information about the function is available, and approximating high-dimensional functions requires exploiting low-dimensional structures of these functions. In this work, the approximation of a function u is built using point evaluations of the function, where these evaluations are selected adaptively. Such problems are encountered when the function represents the output of a black-box computer code, a system or a physical experiment for a given value of a set of input variables. The proposed algorithm relies on an extension of principal component analysis (PCA) to multivariate functions in order to estimate the tensors $v_{\alpha}$. In practice, the PCA is performed on sample-based projections of the function u, using interpolation or least-squares regression. Least-squares regression can provide a stable projection, but it usually requires a large number of evaluations of u, which is not affordable when a single evaluation is very costly. In [1], the authors proposed an optimal weighted least-squares method, with a choice of weights and samples that guarantees an approximation error of the order of the best approximation error using a minimal number of samples. We present an extension of this methodology to the approximation in tree-based format, where the optimal weighted least-squares method is used for the projection onto tensor product spaces. This approach is compared with strategies using standard least-squares methods or interpolation (as proposed in [2]).
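To give a concrete picture of the optimal weighted least-squares method of [1], here is a minimal one-dimensional sketch, assuming an orthonormal Legendre basis with respect to the uniform measure on $[-1, 1]$: samples are drawn from the optimal measure $d\rho_w = (k_m/m)\,d\mu$, with $k_m$ the inverse Christoffel function, and the weights are $w(z_i) = m/k_m(z_i)$. The function names (`basis_matrix`, `weighted_least_squares`) and the target function are illustrative choices, not taken from the paper.

```python
import numpy as np
from numpy.polynomial import legendre

def basis_matrix(z, m):
    # Orthonormal Legendre basis w.r.t. the uniform measure on [-1, 1]:
    # phi_j(x) = sqrt(2j + 1) P_j(x), j = 0, ..., m - 1.
    A = np.column_stack([legendre.legval(z, np.eye(m)[j]) for j in range(m)])
    return A * np.sqrt(2.0 * np.arange(m) + 1.0)

def sample_optimal_measure(n, m, rng):
    # Rejection sampling from d(rho_w) = (k_m(z) / m) d(mu), where
    # k_m(z) = sum_j phi_j(z)^2. On [-1, 1], k_m attains its maximum
    # m^2 at the endpoints, giving acceptance probability k_m(z) / m^2.
    samples = []
    while len(samples) < n:
        z = rng.uniform(-1.0, 1.0)
        k_m = np.sum(basis_matrix(np.array([z]), m) ** 2)
        if rng.uniform() < k_m / m**2:
            samples.append(z)
    return np.array(samples)

def weighted_least_squares(f, n, m, rng):
    # Solve min_{v in V} sum_i w_i |v(z_i) - f(z_i)|^2 with the
    # optimal weights w(z_i) = m / k_m(z_i).
    z = sample_optimal_measure(n, m, rng)
    A = basis_matrix(z, m)
    w = m / np.sum(A**2, axis=1)
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(sw[:, None] * A, sw * f(z), rcond=None)
    return coeffs   # coefficients of the projection in the Legendre basis

rng = np.random.default_rng(0)
coeffs = weighted_least_squares(np.exp, n=60, m=8, rng=rng)
```

With this sampling and weighting, the weighted Gram matrix concentrates around the identity, which is what yields the stability and near-best-approximation guarantees of [1] with a minimal number of samples.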
Highlights
We consider a pair of random variables $(X, Y)$ such that $Y = u(X)$, where $X = (X_1, \ldots, X_d)$ is a random vector and $u : \mathcal{X} \to \mathbb{R}$ is a function
In the context of Uncertainty Quantification, Y is the output of a numerical code and X is the vector of input parameters
For $\alpha \subset D = \{1, \ldots, d\}$, $u$ is identified with a bivariate function in $L^2_{\mu_\alpha} \otimes L^2_{\mu_{\alpha^c}}$
Summary
For $\alpha \subset D = \{1, \ldots, d\}$, $u$ is identified with a bivariate function in $L^2_{\mu_\alpha} \otimes L^2_{\mu_{\alpha^c}}$. Given a dimension partition tree $T$, $u$ has a representation in tree-based tensor format. The best approximation of $u$ by a function with $\alpha$-rank $r_\alpha$ is the truncated singular value decomposition $u_{r_\alpha}(x_\alpha, x_{\alpha^c}) = \sum_{k=1}^{r_\alpha} \sigma_k^\alpha u_k^\alpha(x_\alpha) v_k^\alpha(x_{\alpha^c})$.

Extension to tree-based formats: from the leaves of the tree to the root, determine the subspaces $U_\alpha$ of principal components of $u_\alpha = P_{V_\alpha} u$ for all $\alpha$, with $P_{V_\alpha}$ the orthogonal projection onto $V_\alpha \otimes L^2_{\mu_{\alpha^c}}$: if $S(\alpha) = \emptyset$ (a leaf node), $V_\alpha$ is a given approximation space; if $S(\alpha) \neq \emptyset$, $V_\alpha = \bigotimes_{\beta \in S(\alpha)} U_\beta$, where $S(\alpha)$ denotes the set of children of $\alpha$ in $T$.

The projections are estimated from samples by weighted least-squares: $P_V^w f = \arg\min_{v \in V} \sum_{i=1}^{n} w_i |v(z_i) - f(z_i)|^2$, with $(z_i)_{i=1}^n$ i.i.d. samples from the measure $d\rho_w$.
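To make the truncated SVD step concrete, the following is a minimal numpy sketch for $\alpha = \{1\}$ with $d = 3$, where $u$ is evaluated on a full tensor grid and matricized with respect to $\alpha$. Note that the method described above uses sample-based projections rather than full grids; the function $u$ and the grid below are illustrative assumptions.

```python
import numpy as np

# Illustrative function of d = 3 variables, evaluated on a full tensor grid.
u = lambda x1, x2, x3: 1.0 / (1.0 + x1**2 + x2 * x3)
grid = np.linspace(-1.0, 1.0, 30)
X1, X2, X3 = np.meshgrid(grid, grid, grid, indexing="ij")
U = u(X1, X2, X3)                  # tensor of evaluations, shape (30, 30, 30)

# alpha-matricization for alpha = {1}: rows <-> x_alpha, columns <-> x_alpha^c.
M = U.reshape(30, -1)

# Truncated SVD = best approximation with alpha-rank r_alpha:
# u_r(x_alpha, x_alpha^c) = sum_{k<=r} sigma_k u_k(x_alpha) v_k(x_alpha^c).
r_alpha = 3
Uf, s, Vt = np.linalg.svd(M, full_matrices=False)
M_r = Uf[:, :r_alpha] * s[:r_alpha] @ Vt[:r_alpha]

# The leading left singular vectors span the (discrete analogue of the)
# subspace U_alpha of principal components used at node alpha.
rel_err = np.linalg.norm(M - M_r) / np.linalg.norm(M)
print(f"alpha-rank {r_alpha} relative error: {rel_err:.2e}")
```

In the leaves-to-root construction, this decomposition is applied at every node $\alpha$ of the tree $T$, each time on the projection $u_\alpha = P_{V_\alpha} u$ rather than on $u$ itself.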