Abstract

Discriminative models are preferred when training data is abundant, but research shows that when data is limited, generative models can achieve better performance. In this paper, a novel model named the Bayes perceptron is proposed to combine the advantages of the generative and discriminative approaches. The model divides every feature vector into several subvectors, each of which is modeled under the Bayes assumption. The subvectors are then combined by introducing a weight parameter vector. After some transformations, the weight parameters are fit discriminatively by a perceptron algorithm. Furthermore, we give a detailed theoretical analysis and justification of the algorithm's convergence and its robustness to inseparable data. As an important byproduct, an approach is described for generalizing the binary perceptron classifier to the multiclass case. Experimental evaluations on text classification tasks demonstrate that the proposed approach outperforms both purely generative and purely discriminative models across different training-set sizes.
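The hybrid scheme sketched in the abstract, per-subvector generative scoring followed by discriminative weighting, can be illustrated with a minimal toy example. This is not the paper's implementation: the Bernoulli naive-Bayes scoring, the subvector split, the synthetic data, and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: 6 binary features split into 3 subvectors of 2.
X = rng.integers(0, 2, size=(200, 6))
w_true = np.array([2.0, -1.5, 1.0, 0.5, -2.0, 1.5])
y = np.where(X @ w_true + 0.1 > w_true.sum() / 2, 1, -1)

subvectors = [slice(0, 2), slice(2, 4), slice(4, 6)]

def fit_nb(X, y):
    """Generative step: Bernoulli naive Bayes per subvector (Laplace smoothing)."""
    params = []
    for s in subvectors:
        p_pos = (X[y == 1, s].sum(axis=0) + 1) / ((y == 1).sum() + 2)
        p_neg = (X[y == -1, s].sum(axis=0) + 1) / ((y == -1).sum() + 2)
        params.append((p_pos, p_neg))
    return params

def subvector_scores(x, params):
    """Log-likelihood ratio of each subvector under the two class models."""
    scores = []
    for s, (p_pos, p_neg) in zip(subvectors, params):
        xs = x[s]
        ll_pos = np.sum(xs * np.log(p_pos) + (1 - xs) * np.log(1 - p_pos))
        ll_neg = np.sum(xs * np.log(p_neg) + (1 - xs) * np.log(1 - p_neg))
        scores.append(ll_pos - ll_neg)
    return np.array(scores)

# Discriminative step: fit the subvector weights with perceptron updates.
params = fit_nb(X, y)
S = np.array([subvector_scores(x, params) for x in X])
w = np.zeros(len(subvectors))
for _ in range(20):
    for s_i, y_i in zip(S, y):
        if y_i * (w @ s_i) <= 0:
            w += y_i * s_i  # mistake-driven perceptron update

accuracy = np.mean(np.sign(S @ w) == y)
```

Each subvector contributes a single generative score, so the discriminative stage only has to learn one weight per subvector rather than one per raw feature, which is the sense in which the hybrid can help when training data is scarce.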
