Abstract
AdaBoost is widely used in neural network ensembles as a variant of the boosting algorithm. However, because it concentrates learning on hard samples, AdaBoost easily drives the member networks into degradation. In addition, in a neural network ensemble the weight of each individual network typically accounts only for its misclassification rate. In fact, because AdaBoost makes each network focus on hard samples, an individual network's predictive accuracy on a particular region of the sample space is much higher than its accuracy on the whole sample space. This paper proposes an improved AdaBoost algorithm called Cloud-AdaBoosting. The method introduces the cloud model and fills the training set with similar samples generated by a cloud generator, which overcomes the degradation caused by the tendency to learn hard samples. By computing the certainty degree of each test sample with respect to a network, the weight of that network's output in the ensemble is adjusted dynamically, so that the weight reflects the network's prediction ability on its part of the sample space and thereby improves the overall prediction performance of the ensemble. Experimental results show that the proposed algorithm effectively increases the prediction accuracy of the neural network ensemble.
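The cloud-generator step the abstract refers to can be sketched with the standard forward normal cloud generator from cloud-model theory, which produces "drops" (similar samples) around an expectation `Ex` with entropy `En` and hyper-entropy `He`, along with each drop's certainty degree. This is a minimal illustrative sketch, not the paper's implementation; the function name and parameters are assumptions based on the conventional cloud-model formulation.

```python
import numpy as np

def forward_cloud_generator(Ex, En, He, n, rng=None):
    """Forward normal cloud generator (conventional cloud-model form).

    Ex: expectation of the concept, En: entropy, He: hyper-entropy.
    Returns n cloud drops (similar samples around Ex) and each drop's
    certainty degree in (0, 1].
    """
    rng = np.random.default_rng(rng)
    # Per-drop entropy sampled around En with spread He
    En_prime = rng.normal(En, He, size=n)
    # Cloud drops scattered around Ex with per-drop spread |En'|
    x = rng.normal(Ex, np.abs(En_prime))
    # Certainty degree of each drop with respect to the concept (Ex, En)
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))
    return x, mu

# Hypothetical usage: generate 20 samples similar to a hard sample's
# feature value 0.5, which could then be added to the training set.
drops, certainty = forward_cloud_generator(Ex=0.5, En=0.1, He=0.01, n=20, rng=0)
```

Under this sketch, the same certainty degree `mu` could also serve as the dynamic weight on an individual network's output at test time, as the abstract describes, since it measures how strongly a sample belongs to the region that network specialized in.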