Abstract

In this paper, we consider a broad class of interpolation problems, for both scalar- and vector-valued multivariate functions subject to linear side conditions, such as being divergence-free, where the data are generated via integration against compactly supported distributions. We show that, by using certain families of matrix-valued conditionally positive definite functions, such interpolation problems are well poised; that is, the interpolation matrices are invertible. As a sample result, we show that a divergence-free vector field can be interpolated by a linear combination of convolutions of the data-generating distributions with a divergence-free, $3 \times 3$ matrix-valued conditionally positive definite function. In addition, we obtain norm estimates for inverses of interpolation matrices that arise in a class of multivariate Hermite interpolation problems.
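To fix ideas, one standard way to obtain a divergence-free matrix-valued kernel of the kind mentioned above, and a sketch of the resulting interpolant, is the following (an illustrative assumption only; the symbols $\varphi$, $\Phi$, $\mu_j$, and $c_j$ are not taken from the paper, and the precise kernel families and side conditions are those defined in the body of the text):
\[
\Phi(x) \;=\; \bigl(-\Delta\, I_3 + \nabla\nabla^{\mathsf T}\bigr)\varphi(x),
\qquad
s(x) \;=\; \sum_{j=1}^{N} \bigl(\mu_j * \Phi\bigr)(x)\, c_j ,
\]
where $\varphi$ is a scalar conditionally positive definite function, so that each column of $\Phi$ is divergence-free by construction; the $\mu_j$ are the compactly supported data-generating distributions; and the coefficient vectors $c_j \in \mathbb{R}^3$ are determined by the interpolation conditions. Depending on the order of conditional positive definiteness, a low-degree polynomial term may also be required in $s$. Since convolution commutes with differentiation, $\operatorname{div} s = 0$, matching the sample result stated in the abstract.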
