Abstract

Machine learning techniques have received much attention in many areas for regression and classification tasks. In this paper, two new support vector regression (SVR) models, namely least-squares SVR and ε-SVR, are developed under the Bayesian inference framework with a squared loss function and an ε-insensitive squared loss function, respectively. In this framework, a Gaussian process prior is assigned to the regression function, and the maximum a posteriori estimate of this function leads to a support vector regression problem. The proposed method provides point-wise probabilistic prediction while retaining the structural risk minimization principle, and it allows the optimal hyper-parameters to be determined by maximizing the Bayesian model evidence. Based on the Bayesian SVR model, an active learning algorithm is developed in which new training points are selected adaptively according to a learning function, so that the SVR model is updated progressively. Numerical results show that the two Bayesian SVR models are very promising for constructing accurate regression models for problems with diverse characteristics, especially medium- and high-dimensional problems.
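To make the workflow concrete, the sketch below illustrates the general loop described in the abstract: fit a probabilistic surrogate whose hyper-parameters are tuned by maximizing the model evidence (marginal likelihood), obtain point-wise predictive means and standard deviations, and use a learning function to pick the next training point. This is a minimal illustration under stated assumptions, not the authors' implementation: scikit-learn's GaussianProcessRegressor stands in for the Bayesian SVR, and maximum predictive variance is used as the learning function, which may differ from the learning function used in the paper.

```python
# Minimal active-learning sketch (assumptions: a Gaussian process surrogate is
# used in place of the paper's Bayesian SVR, and predictive variance serves as
# the learning function; the toy 1-D target stands in for an expensive model).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel


def target(x):
    """Toy 1-D function standing in for an expensive simulation."""
    return np.sin(3.0 * x) + 0.5 * x


rng = np.random.default_rng(0)

# Small initial design and a pool of candidate points.
X_train = rng.uniform(-2.0, 2.0, size=(5, 1))
y_train = target(X_train).ravel()
X_pool = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)

for it in range(15):
    # Kernel hyper-parameters are selected by maximizing the log marginal
    # likelihood (the model evidence), mirroring the evidence-maximization
    # step described in the abstract.
    model = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    model.fit(X_train, y_train)

    # Point-wise probabilistic prediction: mean and standard deviation.
    mean, std = model.predict(X_pool, return_std=True)

    # Learning function (assumed here): add the candidate point with the
    # largest predictive uncertainty to the training set.
    idx = int(np.argmax(std))
    x_new = X_pool[idx:idx + 1]
    X_train = np.vstack([X_train, x_new])
    y_train = np.append(y_train, target(x_new).ravel())

print("Final training set size:", len(y_train))
```

In this kind of loop, the surrogate is refitted after each new point, so the design concentrates samples where the current model is most uncertain; the paper's contribution is to obtain the required predictive uncertainty from the Bayesian SVR itself rather than from a separate probabilistic model.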
