Abstract

This paper presents a strategy for combining multiple individual routing classifiers to improve classification accuracy in natural language call routing applications. Since the errors of the individual classifiers in an ensemble should ideally be uncorrelated, we propose a combination strategy in which the combined classifier accuracy is a function of both the accuracy of the individual classifiers and the correlation between their classification errors. We show theoretically and empirically that our combination strategy, named the constrained minimization technique, has good potential to improve the classification accuracy of single classifiers. We also show how discriminative training, specifically the generalized probabilistic descent (GPD) algorithm, can further boost the performance of routing classifiers. The GPD algorithm considers both positive and negative examples during training, minimizing the classification error and increasing the score separation of the correct hypothesis from competing hypotheses. Under GPD, some parameters become negative, a form of suppressive learning not traditionally possible, yielding important antifeatures. Experimental evaluation is carried out on a banking call routing task and on the Switchboard database, with sets of 23 and 67 destinations, respectively. Results show that either the GPD algorithm or the constrained minimization technique, applied separately, improves on the accuracy of the baseline classifiers by 44%. When the constrained minimization technique is added on top of GPD, we show a further 15% reduction in the classification error rate.
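To make the GPD idea concrete, the following is a minimal sketch of GPD-style discriminative training for a linear router, not the paper's implementation. The toy data, learning rate, and smoothing constants (eta, gamma) are assumptions. The key ingredients are the smoothed misclassification measure (correct-class score versus a soft maximum over competing classes) and a sigmoid loss whose gradient raises the correct score while suppressing competitors, which is what lets weights go negative and act as antifeatures.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feats, n_classes = 20, 3
W = np.zeros((n_classes, n_feats))  # routing matrix: one weight row per destination

# Toy training set: random feature vectors with one class-indicative feature boosted.
X = rng.random((90, n_feats))
y = np.repeat(np.arange(n_classes), 30)
for i, c in enumerate(y):
    X[i, c] += 1.0

eta, gamma, lr = 4.0, 1.0, 0.1  # smoothing, sigmoid slope, step size (assumed values)
for _ in range(200):
    for x, c in zip(X, y):
        g = W @ x                                 # discriminant score per destination
        comp = np.delete(g, c)                    # scores of competing destinations
        # Anti-discriminant: smoothed max over competitors (softmax-style pooling).
        G = np.log(np.mean(np.exp(eta * comp))) / eta
        d = -g[c] + G                             # misclassification measure
        loss = 1.0 / (1.0 + np.exp(-gamma * d))   # smooth surrogate for 0/1 error
        dl = gamma * loss * (1.0 - loss)          # d(loss)/d(d)
        # Gradient descent: push the correct score up ...
        W[c] += lr * dl * x
        # ... and push competitor scores down, weighted by their share of G.
        p = np.exp(eta * comp)
        p /= p.sum()
        for j, k in enumerate([m for m in range(n_classes) if m != c]):
            W[k] -= lr * dl * p[j] * x

acc = np.mean((X @ W.T).argmax(axis=1) == y)
print(f"training accuracy: {acc:.2f}")
```

Because competitor rows are repeatedly pushed down on features they should not respond to, some entries of `W` end up negative: these are the antifeatures the abstract refers to, which a purely additive training scheme would not produce.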

