Abstract

Recently, many researchers have designed neural network architectures with evolutionary algorithms, but most have used only the fittest solution of the final generation. To better exploit the information in the population, an ensemble of individuals is a more promising choice, because combining a set of classifiers can yield higher accuracy than relying on the single best classifier among them. One of the major factors determining ensemble accuracy is the diversity of the classifier set. In this paper, we present a method that generates diverse evolutionary neural networks through fitness sharing and then combines these networks using the behavior knowledge space method. Fitness sharing, which shares resources among individuals whose mutual distance is smaller than a sharing radius, is a representative speciation method; it produces more diverse solutions than standard evolutionary algorithms, which converge to a single solution. In particular, the proposed method calculates the distance between individuals using average output, Pearson correlation, and modified Kullback–Leibler entropy to enhance the performance of fitness sharing. In experiments on the Australian credit card assessment, breast cancer, and diabetes datasets from the UCI repository, the proposed method outperformed not only a non-speciation baseline but also previously published methods.
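The sketch below illustrates the standard fitness-sharing scheme referred to in the abstract: each individual's raw fitness is divided by a niche count accumulated from neighbors within the sharing radius. It is a minimal illustration, not the paper's implementation; the Euclidean distance over network outputs, the triangular sharing kernel, and the function names are assumptions standing in for the paper's average-output, Pearson-correlation, and modified Kullback–Leibler measures.

```python
import numpy as np

def sharing_function(distance, sigma_share, alpha=1.0):
    """Triangular sharing kernel: 1 at distance 0, falling to 0 at the sharing radius."""
    if distance < sigma_share:
        return 1.0 - (distance / sigma_share) ** alpha
    return 0.0

def shared_fitness(raw_fitness, outputs, sigma_share):
    """Divide each individual's raw fitness by its niche count.

    `outputs[i]` is network i's output vector on a validation set.
    The Euclidean distance used here is an illustrative stand-in for the
    paper's diversity measures (average output, Pearson correlation,
    modified Kullback-Leibler entropy).
    """
    outputs = np.asarray(outputs, dtype=float)
    n = len(raw_fitness)
    shared = np.empty(n)
    for i in range(n):
        niche_count = 0.0
        for j in range(n):
            d = np.linalg.norm(outputs[i] - outputs[j])
            niche_count += sharing_function(d, sigma_share)
        # niche_count >= 1 because each individual shares with itself.
        shared[i] = raw_fitness[i] / niche_count
    return shared
```

Dividing by the niche count penalizes individuals in crowded regions of the output space, so selection pressure is redistributed toward under-represented behaviors, which is what keeps the evolved ensemble diverse.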
