Abstract

This paper proposes a Cascaded Random Forest (CRF) method, which improves classification performance by combining two enhancements to the Random Forest (RF) algorithm. First, a neighborhood rough sets based Hierarchical Random Subspace Method is designed for feature selection, which improves the strength of the base classifiers and increases the diversity between each pair of base classifiers. Second, Boosting is introduced into RF. Because updating the sample weights by minimizing the training error in Boosting often leads to overfitting, CRF incorporates the out-of-bag error into the sample-weight update. Unlike the existing Boosting strategy, in which only one base classifier is generated per iteration, the proposed CRF trains several base classifiers at each iteration. To evaluate the performance of our method, CRF is compared with related RF methods and the support vector machine on three benchmark hyperspectral datasets, and the experimental results show that CRF provides competitive solutions for hyperspectral image classification.
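The following is a minimal sketch, not the authors' reference implementation, of the boosting-style loop the abstract describes: at each iteration several base classifiers (a small forest) are trained at once, and the sample weights are then updated using the out-of-bag (OOB) error instead of the training error alone. The function names, the number of trees per iteration, and the exact weight-update rule are illustrative assumptions, since the abstract does not give the paper's formulas.

```python
# Sketch of a CRF-style training loop (assumptions noted above), using scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cascaded_rf_fit(X, y, n_iterations=10, trees_per_iteration=20, random_state=0):
    n_samples = X.shape[0]
    weights = np.full(n_samples, 1.0 / n_samples)  # uniform initial sample weights
    stages = []
    rng = np.random.RandomState(random_state)
    for _ in range(n_iterations):
        # Train several base classifiers at once: one small forest per iteration,
        # with bootstrapping and OOB scoring enabled.
        forest = RandomForestClassifier(
            n_estimators=trees_per_iteration,
            oob_score=True,
            bootstrap=True,
            random_state=rng.randint(1 << 30),
        )
        forest.fit(X, y, sample_weight=weights)
        stages.append(forest)
        # Per-sample class probabilities from OOB votes; rows that were never
        # out of bag contain NaN, which nan_to_num treats as "no evidence".
        oob_proba = np.nan_to_num(forest.oob_decision_function_)
        oob_pred = forest.classes_[np.argmax(oob_proba, axis=1)]
        misclassified = oob_pred != y
        # Hypothetical multiplicative (AdaBoost-like) update driven by the OOB error;
        # the paper's exact rule is not given in the abstract.
        oob_error = np.clip(np.average(misclassified, weights=weights), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - oob_error) / oob_error)
        weights *= np.exp(alpha * misclassified)
        weights /= weights.sum()
    return stages

def cascaded_rf_predict(stages, X):
    # Average the class-probability estimates over all stages, then take the argmax.
    proba = np.mean([stage.predict_proba(X) for stage in stages], axis=0)
    return stages[0].classes_[np.argmax(proba, axis=1)]
```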
