Abstract

Large-margin techniques have been studied intensively by the machine learning community as a way to balance the empirical error rate on the training set against the generalization ability on the test set. However, they have mostly been developed together with generic discriminative models such as support vector machines (SVMs) and are often difficult to apply to parameter estimation problems for generative models such as Gaussians and hidden Markov models. The difficulties lie both in the formulation of the training criteria and in the development of efficient optimization algorithms. In this article, we consider the basic problem of large-margin training of Gaussian models. We take the geometric perspective of separating patterns with concentric ellipsoids, a concept that may be less familiar to signal processing researchers and that we elaborate on here. We describe the approach of finding maximum-ratio separating ellipsoids (MRSEs) and derive an extension with soft margins. We show how to formulate the soft-margin MRSE problem as a convex optimization problem, more specifically a semidefinite program (SDP). We then derive its duality theory and optimality conditions and apply the method to vowel recognition, a classical problem in signal processing.
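To make the formulation above concrete, the following minimal Python sketch poses a soft-margin concentric-ellipsoid separation problem as an SDP in CVXPY. It illustrates the general idea rather than the paper's exact MRSE program: the homogeneous parametrization of the quadratic, the slack penalty C, and the synthetic two-class data are assumptions introduced here for illustration.

import numpy as np
import cvxpy as cp


def soft_margin_ellipsoids(X_in, X_out, C=1.0):
    """Illustrative soft-margin concentric-ellipsoid separation via an SDP.

    X_in  : (m, n) array, points the inner ellipsoid should enclose.
    X_out : (k, n) array, points the outer ellipsoid should exclude.
    C     : assumed penalty weight trading separation against slack.
    """
    m, n = X_in.shape
    k = X_out.shape[0]

    # Homogeneous quadratic q(x) = [x; 1]^T G [x; 1].  Requiring G >= 0
    # keeps q nonnegative, so its sublevel sets {q <= s} form nested
    # ellipsoids sharing one center (when the top-left block is nonsingular).
    G = cp.Variable((n + 1, n + 1), PSD=True)
    t = cp.Variable()                   # squared separation level
    xi = cp.Variable(m, nonneg=True)    # slack for inner-class violations
    eta = cp.Variable(k, nonneg=True)   # slack for outer-class violations

    def q(x):
        z = np.append(x, 1.0)           # homogeneous coordinates
        return cp.quad_form(z, G)       # linear in G, so constraints are convex

    constraints = [q(a) <= 1 + xi[i] for i, a in enumerate(X_in)]
    constraints += [q(b) >= t - eta[j] for j, b in enumerate(X_out)]

    # Maximize the separation level, penalizing margin violations.
    prob = cp.Problem(cp.Maximize(t - C * (cp.sum(xi) + cp.sum(eta))),
                      constraints)
    prob.solve()
    return G.value, t.value


if __name__ == "__main__":
    # Hypothetical two-class data: a compact cluster surrounded by a ring.
    rng = np.random.default_rng(0)
    inner = rng.normal(0.0, 0.5, size=(40, 2))
    theta = rng.uniform(0.0, 2.0 * np.pi, size=40)
    outer = 3.0 * np.column_stack([np.cos(theta), np.sin(theta)])
    outer += rng.normal(0.0, 0.2, size=outer.shape)

    G, t = soft_margin_ellipsoids(inner, outer, C=1.0)
    print(f"separation level t = {t:.3f}  (ratio >= {np.sqrt(max(t, 0.0)):.3f})")

Because q(x) is linear in the matrix variable G, every data constraint above is affine, and the positive-semidefiniteness of G is the only conic constraint; that is what makes the problem a semidefinite program. When the optimal t exceeds 1 with all slacks at zero, the sublevel sets {q <= 1} and {q <= t} are concentric ellipsoids separating the two classes with a ratio of at least sqrt(t).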
