Abstract
The performance of ensemble algorithms is related to the individual accuracy of the base learners and the diversity of their outputs. The individual accuracy of a base learner is directly related to how similar its training set is to the original training set. When each base learner is given a training set modified by randomly selecting features, classes, or samples, diversity is created but individual accuracy decreases. From this point of view, different ensemble algorithms can be seen as choosing between more accurate but less diverse base learners and more diverse but less accurate ones. We propose a meta-ensemble method, named the improved space forest, which adds generated and (hopefully) more distinctive features to the original features. The new features are obtained from randomly selected original features. When a new feature is more distinctive than the original ones, it is selected by the learners, so the ensemble may have more accurate base learners. At the same time, a different improved space is generated for each learner to create diversity. The proposed method can be used with different ensemble methods. We compared the original and improved space versions of the bagging, random forest, and rotation forest algorithms. The improved space versions generally achieved results better than or comparable to the original ones. We also present a theoretical framework for analyzing the individual accuracies and diversities of the base learners.
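The abstract does not give implementation details, but a minimal sketch of the per-learner improved-space idea might look like the following. Everything here is an assumption made for illustration: the generated features are taken to be simple sums of randomly chosen original feature pairs, the base learners are decision trees combined by majority vote, and the class name ImprovedSpaceEnsemble is hypothetical rather than the authors' code.

```python
# Hypothetical sketch of the improved-space ensemble idea described above.
# Assumptions (not specified in the abstract): new features are sums of
# randomly chosen original feature pairs; base learners are decision trees
# combined by majority vote; class labels are non-negative integers.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


class ImprovedSpaceEnsemble:
    def __init__(self, n_learners=10, n_new_features=5, random_state=0):
        self.n_learners = n_learners
        self.n_new_features = n_new_features
        self.rng = np.random.default_rng(random_state)
        self.learners = []  # (fitted tree, feature-pair indices) per learner

    def _augment(self, X, pairs):
        # Append one generated feature per randomly selected pair of
        # original features; here the combination is a plain sum.
        new_cols = [X[:, i] + X[:, j] for i, j in pairs]
        return np.column_stack([X] + new_cols)

    def fit(self, X, y):
        n_features = X.shape[1]
        for _ in range(self.n_learners):
            # A different improved space per learner creates diversity,
            # while the original features are always kept available.
            pairs = [tuple(self.rng.choice(n_features, 2, replace=False))
                     for _ in range(self.n_new_features)]
            tree = DecisionTreeClassifier(random_state=0)
            tree.fit(self._augment(X, pairs), y)
            self.learners.append((tree, pairs))
        return self

    def predict(self, X):
        votes = np.array([tree.predict(self._augment(X, pairs))
                          for tree, pairs in self.learners])
        # Majority vote across base learners for each sample.
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

Because a tree only splits on a generated feature when it is more distinctive than the original ones, the augmentation can raise individual accuracy without discarding information, while the random per-learner feature pairs supply diversity.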