Abstract
Imbalanced classification tasks are widespread in many real-world applications. For such tasks, nondecomposable performance measures such as the area under the receiver operating characteristic curve (AUC) and the Fβ measure are usually much more appropriate classification criteria than the accuracy rate (AR), since the class distribution is imbalanced. On the other hand, the minimax probability machine is a popular method for binary classification problems and aims at learning a linear classifier by maximizing the AR, which makes it unsuitable for imbalanced classification tasks. The purpose of this article is to develop a new minimax probability machine for the Fβ measure, called the minimax probability machine for the Fβ measure (MPMF), which can be used to deal with imbalanced classification tasks. A brief discussion is also given on how to extend the MPMF model to several other nondecomposable performance measures listed in the article. To solve the MPMF model effectively, we derive its equivalent form, which can then be solved by an alternating descent method to learn a linear classifier. Furthermore, the kernel trick is employed to derive a nonlinear MPMF model to learn a nonlinear classifier. Several experiments on real-world benchmark datasets demonstrate the effectiveness of our new model.
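As background for the performance measure the abstract centers on, the following is a minimal sketch of computing the Fβ measure from confusion-matrix counts. The function name and arguments are illustrative and not taken from the paper; the formula Fβ = (1 + β²)·P·R / (β²·P + R), with precision P and recall R, is the standard definition.

```python
def f_beta(tp: int, fp: int, fn: int, beta: float = 1.0) -> float:
    """Return the F_beta score from true positives, false positives,
    and false negatives; larger beta weights recall more heavily."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    denom = beta ** 2 * precision + recall
    return (1 + beta ** 2) * precision * recall / denom if denom else 0.0

# On an imbalanced task, F_beta is more informative than accuracy:
# a classifier that predicts only the majority class can score high
# accuracy yet has zero recall on the minority class, so F_beta = 0.
majority_only = f_beta(tp=0, fp=0, fn=10)   # degenerate predictor
balanced_f1 = f_beta(tp=8, fp=4, fn=2)      # ordinary F1 (beta = 1)
```

This zero score for a majority-class predictor is exactly why maximizing accuracy, as the standard minimax probability machine does, is ill-suited to imbalanced data.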
IEEE Transactions on Neural Networks and Learning Systems