Abstract

Gaussian processes (GPs) are widely used for modeling unknown functions or surfaces in applications ranging from regression to classification to spatial processes. Although there is an increasingly vast literature on applications, methods, theory and algorithms related to GPs, the overwhelming majority of this literature focuses on the case in which the input domain is a Euclidean space. However, particularly in recent years with the increasing collection of complex data, it is common for the input domain to lack such a simple form. For example, the inputs are often restricted to a non-Euclidean manifold, the case that motivates this article. In particular, we propose a general extrinsic framework for GP modeling on manifolds, which relies on embedding the manifold into a Euclidean space and then constructing extrinsic kernels for GPs on its image. These extrinsic Gaussian processes (eGPs) are used as prior distributions for unknown functions in Bayesian inference. Our approach is simple and general, and we show that eGPs inherit desirable theoretical properties from GP models on Euclidean spaces. We consider applications of our models to regression and classification problems with predictors lying in a large class of manifolds, including spheres, planar shape spaces, a space of positive definite matrices, and Grassmannians. Our models can be readily used by practitioners in the biological sciences for various regression and classification problems, such as disease diagnosis or detection. Our work is also likely to have impact in spatial statistics when spatial locations lie on the sphere or other geometric spaces.
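The core construction can be illustrated with a minimal sketch: embed the manifold into a Euclidean space, evaluate an ordinary ambient-space kernel (here a squared-exponential, chosen for illustration) on the embedded coordinates, and run standard GP regression. The sphere example below, including the function names, lengthscale, and noise values, is a hypothetical illustration assuming the unit sphere S² with the identity embedding into R³; it is not the paper's implementation.

```python
import numpy as np

def embed_sphere(theta, phi):
    # Embed points on the unit sphere S^2 into R^3 via spherical coordinates.
    return np.column_stack([
        np.sin(theta) * np.cos(phi),
        np.sin(theta) * np.sin(phi),
        np.cos(theta),
    ])

def extrinsic_kernel(X, Y, lengthscale=0.5, variance=1.0):
    # Squared-exponential kernel evaluated on the embedded (ambient)
    # coordinates. Restricting a positive semi-definite ambient kernel to the
    # image of the manifold yields a valid kernel on the manifold itself.
    sq_dists = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-2):
    # Standard GP regression posterior mean, using the extrinsic kernel.
    K = extrinsic_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = extrinsic_kernel(X_test, X_train)
    return K_star @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
theta = rng.uniform(0, np.pi, 30)
phi = rng.uniform(0, 2 * np.pi, 30)
X = embed_sphere(theta, phi)
y = X[:, 2] + 0.05 * rng.standard_normal(30)  # noisy function of latitude
pred = gp_posterior_mean(X, y, X[:5])
```

Because the kernel depends on the predictors only through their embedded images, any manifold with a convenient embedding (planar shape spaces, positive definite matrices, Grassmannians) can be handled by swapping out `embed_sphere`.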

Highlights

  • Over the past few decades, Gaussian process (GP) models have emerged as very powerful tools in many problems of statistics and machine learning

  • We refer to the resulting GPs as extrinsic GPs (eGPs). eGPs are shown to inherit appealing properties of GPs defined on Euclidean spaces, and they adapt to the dimension of the manifold rather than the dimension of the Euclidean space in which the manifold is embedded

  • We propose a general extrinsic framework for constructing Gaussian processes on manifolds for regression and classification with manifold-valued predictors

Summary

Introduction

Over the past few decades, Gaussian process (GP) models have emerged as very powerful tools in many problems of statistics and machine learning. Most of this work, however, assumes that the input domain is a Euclidean space. An interesting exception is due to Yang and Dunson (2016), who show that by imposing a Gaussian process prior on the regression function, with a covariance kernel defined directly on the ambient space, the posterior distribution achieves a contraction rate depending on the intrinsic dimension of the manifold. They assume that the unknown lower-dimensional space around which the predictors concentrate belongs to a class of submanifolds of Euclidean space.

Regression and classification on manifolds
Examples
Spheres
Landmark-based shape spaces Σ₂ᵏ
Diffusion tensor imaging and positive definite matrices
Properties of eGPs
Mean square differentiability
Posterior contraction rates of eGPs
Findings
Discussion and conclusion