Abstract

Gaussian Process Functional Regression (GPFR) is a powerful tool in functional data analysis. In practical applications, functional data may be generated from different signal sources, and a single GPFR is not flexible enough to model such data accurately. To tackle this heterogeneity problem, a finite mixture of Gaussian Process Functional Regressions (mix-GPFR) has been suggested. However, the number of components in mix-GPFR needs to be specified a priori, which is difficult to determine in practice. In this paper, we propose a Dirichlet Process Mixture of Gaussian Process Functional Regressions (DPM-GPFR), in which there are potentially infinitely many GPFR components governed by a Dirichlet process. Thus, DPM-GPFR is far more flexible than a single GPFR and sidesteps the model selection problem in mix-GPFR. We further develop a fully Bayesian treatment for learning DPM-GPFR based on the Variational Expectation-Maximization (VEM) algorithm. Experimental results on both synthetic and real-world datasets demonstrate the effectiveness of the proposed method.
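To make the generative idea concrete, the sketch below (not from the paper; the truncation level, hyperparameter values, and function names are illustrative) draws heterogeneous functional data from a truncated stick-breaking approximation of a Dirichlet process mixture of GP regressions, using only NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, truncation, rng):
    """Truncated stick-breaking weights for a Dirichlet process prior."""
    betas = rng.beta(1.0, alpha, size=truncation)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining

def rbf_kernel(x, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance matrix on a 1-D input grid."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Illustrative settings (not taken from the paper).
alpha, truncation, n_curves = 2.0, 20, 50
x = np.linspace(0.0, 1.0, 100)

weights = stick_breaking(alpha, truncation, rng)
# Each component has its own mean level and kernel lengthscale.
means = rng.normal(0.0, 3.0, size=truncation)
lengthscales = rng.uniform(0.05, 0.5, size=truncation)

# Assign each curve to a component, then draw it from that component's GP.
assignments = rng.choice(truncation, size=n_curves, p=weights / weights.sum())
curves = np.stack([
    rng.multivariate_normal(
        np.full_like(x, means[k]),
        rbf_kernel(x, lengthscale=lengthscales[k]) + 1e-6 * np.eye(x.size),
    )
    for k in assignments
])

print("components actually used:", np.unique(assignments).size)
```

The weights are normalized before sampling because truncation leaves a small amount of residual mass; in a variational treatment such as the paper's VEM algorithm, a truncated stick-breaking representation of this kind is typically what the variational family is built on, though the exact parameterization used in DPM-GPFR may differ.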
