Abstract

The random subspace method (RSM) is an ensemble learning algorithm widely used in pattern classification. By constructing an ensemble of base learners, RSM offers low error rates and improved insensitivity to noise. However, the random selection of feature subsets can degrade the final ensemble decision, because classifiers trained on subsets with low class separability still contribute fully to the vote. In this study, we present an improved version of RSM that introduces a weighting factor into the combination phase. The class separability criterion J3 is used as the weighting factor to improve classification performance and eliminate this drawback of the standard RSM algorithm. Each randomly selected subset is quantified by its J3 measure, which determines the voting weight in the model combination phase, so that classifiers trained on subsets with poor class separability receive lower voting weights. Two models are presented: J3-weighted RSM and optimized J3-weighted RSM. In J3-weighted RSM, the computed weights are directly multiplied by the class assignment posteriors, whereas in optimized J3-weighted RSM, the weights are first optimized by a pattern search algorithm before being multiplied by the posteriors. Both models are shown to provide lower error rates at lower subset dimensionality.
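A minimal sketch of the J3-weighted combination idea is given below, under assumptions not stated in the abstract: J3 is taken as trace(Sw⁻¹·Sm) with Sw the within-class and Sm the mixture scatter matrix, decision trees stand in for the unspecified base learners, and the names J3WeightedRSM, subset_dim, and j3_score are illustrative. The pattern-search optimization of the second model is omitted.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def j3_score(X, y):
    """Class separability criterion J3 = trace(Sw^-1 @ Sm)."""
    classes, counts = np.unique(y, return_counts=True)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c, n_c in zip(classes, counts):
        Xc = X[y == c]
        Sw += (n_c / len(y)) * np.cov(Xc, rowvar=False, bias=True)
    Sm = np.cov(X, rowvar=False, bias=True)  # mixture (total) scatter
    return np.trace(np.linalg.pinv(Sw) @ Sm)

class J3WeightedRSM:
    """Random subspace ensemble whose members vote with J3-derived weights."""

    def __init__(self, n_estimators=50, subset_dim=5, random_state=0):
        self.n_estimators = n_estimators
        self.subset_dim = subset_dim
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        self.members_ = []
        for _ in range(self.n_estimators):
            # Random feature subset, as in standard RSM.
            feats = self.rng.choice(X.shape[1], self.subset_dim, replace=False)
            clf = DecisionTreeClassifier().fit(X[:, feats], y)
            # Voting weight from the subset's class separability.
            w = j3_score(X[:, feats], y)
            self.members_.append((feats, clf, w))
        total = sum(w for *_, w in self.members_)
        self.members_ = [(f, c, w / total) for f, c, w in self.members_]
        return self

    def predict(self, X):
        # Weighted sum of posteriors: poorly separable subsets contribute less.
        votes = sum(w * clf.predict_proba(X[:, feats])
                    for feats, clf, w in self.members_)
        return self.members_[0][1].classes_[np.argmax(votes, axis=1)]
```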
