Abstract

We consider a fully Bayesian treatment of radial basis function regression and propose a solution to the instability of basis selection. When bases are selected solely according to the magnitude of their posterior inclusion probabilities, many bases in the same neighborhood often end up being selected, leading to redundancy and, ultimately, inaccuracy of the representation. In this paper, we propose a straightforward solution to the problem based on post-processing the sample path yielded by the model space search technique. Specifically, we perform an a posteriori model-based clustering of the sample path via a mixture of Gaussians, and then select the points closest to the means of the Gaussians. Our solution is found to be more stable and yields better performance on simulated and real tasks.
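The post-processing step described above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the basis centers visited along the model-space sample path are clustered with a Gaussian mixture, and the visited point closest to each mixture mean is retained as a representative basis; the function name and arguments are our own choices.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_representative_bases(sample_path, k, seed=0):
    """Cluster the visited basis centers and keep one point per cluster.

    sample_path : (m, p) array of basis centers visited by the sampler.
    k           : number of mixture components, i.e. the target sparsity.
    """
    gmm = GaussianMixture(n_components=k, random_state=seed).fit(sample_path)
    selected = []
    for mean in gmm.means_:
        # keep the visited center closest to this mixture mean
        idx = int(np.argmin(np.linalg.norm(sample_path - mean, axis=1)))
        selected.append(idx)
    return sorted(set(selected))
```

Selecting the nearest visited point, rather than the mixture mean itself, guarantees that each retained basis is centered at an actual data location.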

Highlights

  • We are given a training set D = {(xᵢ, yᵢ), i = 1, · · ·, n : xᵢ ∈ X ⊂ IR^p, yᵢ ∈ IR}, where the yᵢ's are realizations of Yᵢ = f*(xᵢ) + εᵢ, with the εᵢ's representing the noise terms, assumed to be independently normally distributed with mean 0 and variance σ².
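The assumed data-generating model can be illustrated with a minimal simulation: noisy observations yᵢ = f*(xᵢ) + εᵢ with iid Gaussian noise. The particular target function, sample size, and noise level below are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p, sigma = 100, 1, 0.1

# x_i in X, a subset of R^p
X = rng.uniform(-1.0, 1.0, size=(n, p))

# true regression function f* (a hypothetical choice)
f_star = lambda x: np.sin(2.0 * np.pi * x[:, 0])

# y_i = f*(x_i) + eps_i, eps_i ~ N(0, sigma^2) independently
y = f_star(X) + rng.normal(0.0, sigma, size=n)
```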

  • We assume that H is a reproducing kernel Hilbert space (RKHS), meaning that H is a Hilbert space of functions f : X → IR equipped with a unique kernel K : X × X → IR, satisfying K(·, x) ∈ H for all x ∈ X and the reproducing property f(x) = ⟨f, K(·, x)⟩_H for all f ∈ H
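The Gaussian radial basis kernel is the canonical example of such a reproducing kernel on IR^p; the bandwidth below is a hypothetical choice. The snippet checks the two properties any valid kernel matrix must exhibit: symmetry and positive semi-definiteness.

```python
import numpy as np

def gaussian_kernel(X1, X2, bandwidth=1.0):
    """K(x, x') = exp(-||x - x'||^2 / (2 * bandwidth^2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

X = np.random.default_rng(0).normal(size=(10, 2))
K = gaussian_kernel(X, X)

assert np.allclose(K, K.T)                        # symmetry
assert np.min(np.linalg.eigvalsh(K)) > -1e-10     # positive semi-definite
```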

  • Given the Gaussian likelihood provided by y ∼ N (Kw, σ²Iₙ), we seek to specify a prior over w such that the Bayesian estimator ŵ of w is k-sparse, i.e. has only k nonzero entries. This problem of sparse Bayesian learning in the context of radial basis function networks has been scrutinized by several authors: Tipping (2001) introduced and developed the Relevance Vector Machine (RVM)
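As a simple point of reference (a sketch, not the RVM itself), pairing the Gaussian likelihood y ∼ N(Kw, σ²Iₙ) with a plain Gaussian prior w ∼ N(0, τ²I) yields the ridge-type posterior mean below; sparsity-inducing priors such as the RVM's replace the common variance τ² with per-coefficient variances, driving most entries of ŵ to zero.

```python
import numpy as np

def posterior_mean(K, y, sigma2=0.01, tau2=1.0):
    """Posterior mean of w under y ~ N(Kw, sigma2*I), w ~ N(0, tau2*I).

    Solves (K'K + (sigma2/tau2) I) w_hat = K'y.
    """
    m = K.shape[1]
    A = K.T @ K + (sigma2 / tau2) * np.eye(m)
    return np.linalg.solve(A, K.T @ y)
```

With K = I this shrinks y toward zero by the factor 1 / (1 + σ²/τ²), the usual ridge behavior.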


Introduction

Given the Gaussian likelihood provided by y ∼ N (Kw, σ²Iₙ), we seek to specify a prior over w such that the Bayesian estimator ŵ of w is k-sparse, i.e. has only k nonzero entries. This problem of sparse Bayesian learning in the context of radial basis function networks has been scrutinized by several authors: Tipping (2001) introduced and developed the Relevance Vector Machine (RVM). Despite providing a nice extension to its predecessor, the handling by Fokoue (2008) of cases where the data matrix is not of full rank produces model search results that tend to yield rather unstable basis function selections.

Aspects of Model Space Search
Mixture Modeling of the Sample Path for Stability
Example of Fit Using Mixture Modelling
Conclusion and Discussion