Abstract
In this article, we present a new method of multiclass classification that combines multiple binary classifiers in the framework of information transmission theory. Within error correcting output coding (ECOC), a misclassification by each binary classifier is formulated as a bit inversion under a probabilistic model. While conventional Hamming decoding assumes a binary symmetric channel, this symmetry assumption is especially problematic in multiclass classification: for example, the one-vs-rest (1-vs-R) approach typically produces an asymmetric situation even if all classes contain the same number of examples. This asymmetry corresponds to the two kinds of error rates in a binary classification problem: the false positive rate and the false negative rate. We propose a probabilistic model that assumes an asymmetric channel with three inputs and two outputs. Through maximum likelihood estimation with the proposed model, we can identify the properties of the noisy channel according to the performance of the applied binary classifiers. The model then readily yields a multiclass label and a class membership probability for each input. Experimental studies on a synthetic dataset and datasets from the UCI repository show that the proposed method is superior to Hamming decoding and comparable to other multiclass classification methods such as the multiclass support vector machine.
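The contrast between symmetric and asymmetric decoding can be sketched as follows. This is a minimal illustration, not the paper's method: the code matrix, output vector, and error rates below are made-up numbers, the sketch uses only binary codewords (the paper's full channel model also has a third input symbol), and the asymmetric rates are assumed fixed rather than fitted by maximum likelihood. Observing +1 where the codeword says -1 costs only the (large) false positive rate, while observing -1 where the codeword says +1 costs the (small) false negative rate, so the two decoders can disagree.

```python
import numpy as np

# Hypothetical ECOC code matrix for 3 classes: row k is the codeword for
# class k, column j is the target output of binary classifier j.
M = np.array([
    [+1, +1, +1, +1],
    [-1, -1, +1, -1],
    [-1, +1, -1, +1],
])

def hamming_decode(outputs, code):
    """Conventional decoding: the class whose codeword has the smallest
    Hamming distance to the observed classifier outputs."""
    dists = (code != outputs).sum(axis=1)
    return int(np.argmin(dists))

def likelihood_decode(outputs, code, fp, fn):
    """Asymmetric-channel decoding: each bit flips with a probability that
    depends on the input symbol (false positive rate fp when the codeword
    bit is -1, false negative rate fn when it is +1), so the channel is
    not binary symmetric. Pick the class maximizing the log-likelihood."""
    log_lik = np.zeros(code.shape[0])
    for k, codeword in enumerate(code):
        for c, y in zip(codeword, outputs):
            if c == +1:
                p = 1.0 - fn if y == +1 else fn
            else:  # c == -1
                p = fp if y == +1 else 1.0 - fp
            log_lik[k] += np.log(p)
    return int(np.argmax(log_lik))

# With a high false positive rate and a low false negative rate, the two
# rules disagree: Hamming decoding picks class 0 (distance 1), while the
# likelihood rule prefers class 1, whose mismatches are cheap fp-type flips.
y = np.array([+1, +1, +1, -1])
print(hamming_decode(y, M))                      # -> 0
print(likelihood_decode(y, M, fp=0.3, fn=0.05))  # -> 1
```

In practice the paper estimates the per-classifier error rates from data via maximum likelihood rather than fixing them by hand.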