Abstract

Numerous studies have reported that the characteristics of a dataset strongly affect the performance of different classifiers. This observation motivates a study of the problem from a Bayesian perspective: valuable quantitative features of a dataset are easily captured and can be used to update the classification problem so as to guarantee good class separability. The purpose of this learning method is to exploit an attractive pragmatic feature of the Bayesian approach, namely its quantitative treatment of the class imbalance problem. We therefore discuss a programming problem that incorporates probability information: Bayesian Support Vector Machines (BSVMs). In addition, we first modify some of the objectives and constraints of the original programming problems and then examine the effect of these changes. Experiments on several existing datasets show that, when prior distributions are assigned to the programming problem, the estimated classification errors are reduced.
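
To make the general idea concrete, the sketch below shows one common way of injecting class-prior information into an SVM-style programming problem: weighting the hinge-loss penalty of each class inversely to its empirical prior, so the minority class incurs a larger misclassification cost. This is an illustrative simplification, not the BSVM formulation of the paper; the function name `weighted_linear_svm`, the toy data, and the learning-rate settings are all assumptions made for the example.

```python
import numpy as np

def weighted_linear_svm(X, y, class_weights, lr=0.1, lam=0.01, epochs=200):
    """Subgradient descent on a class-weighted hinge loss.

    class_weights maps each label (-1 or +1) to a penalty C_k. In the
    spirit of the abstract, C_k can be derived from class priors so the
    rarer class is penalized more heavily when misclassified.
    (Illustrative sketch only -- not the paper's BSVM formulation.)
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in range(n):
            c = class_weights[y[i]]
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:
                # point violates the margin: hinge term is active
                w -= lr * (lam * w - c * y[i] * X[i])
                b += lr * c * y[i]
            else:
                # only the regularizer contributes
                w -= lr * lam * w
    return w, b

# Toy imbalanced, linearly separable data (4 negatives, 2 positives).
X = np.array([[0.0, 0.0], [0.5, 0.2], [0.2, 0.6], [0.1, 0.4],
              [3.0, 3.0], [3.2, 2.8]])
y = np.array([-1, -1, -1, -1, 1, 1])

# Weight each class inversely to its empirical prior (4/6 vs 2/6).
weights = {-1: 1.0 / (4 / 6), 1: 1.0 / (2 / 6)}

w, b = weighted_linear_svm(X, y, weights)
pred = np.sign(X @ w + b)
```

On this separable toy set the weighted classifier recovers a separating hyperplane; the design choice being illustrated is only where the prior information enters the objective, via per-class penalty coefficients.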
