Abstract

Artificial neural networks (ANNs) are well-known machine learning models whose neuron weights must be adjusted to learn a given task, which is usually done with a gradient-based optimization algorithm. However, gradient-based optimization algorithms are prone to getting stuck in local optima, so researchers have attempted to apply population-based metaheuristics instead. In this paper, we compare the performance of various crossover operators in differential evolution (DE) for training ANNs. We investigated the classification performance of three crossover operators, the binomial crossover, the exponential crossover, and multiple exponential recombination (MER), on medical datasets. The experimental results show that the binomial crossover and MER perform better than the exponential crossover, and that the performance of the exponential crossover varies significantly depending on the network architecture. We also found that dependent variables in ANN training may not be located close to each other in the parameter vector, which nullifies the advantage of the exponential crossover and MER.
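For reference, and not taken from the paper itself, the two classical DE crossover operators compared in the abstract can be sketched as follows; the function names and the use of NumPy are illustrative assumptions, and MER is omitted because its exact formulation is not described here.

```python
import numpy as np

def binomial_crossover(target, mutant, cr, rng):
    """Binomial (uniform) crossover: each dimension independently takes the
    mutant value with probability cr; one randomly chosen dimension is always
    taken from the mutant so the trial vector differs from the target."""
    d = len(target)
    mask = rng.random(d) < cr
    mask[rng.integers(d)] = True
    return np.where(mask, mutant, target)

def exponential_crossover(target, mutant, cr, rng):
    """Exponential crossover: copy a contiguous (circular) block of dimensions
    from the mutant, starting at a random index and continuing while
    successive uniform draws stay below cr (at most d components)."""
    d = len(target)
    trial = target.copy()
    j = rng.integers(d)
    k = 0
    while True:
        trial[j] = mutant[j]
        j = (j + 1) % d
        k += 1
        if k >= d or rng.random() >= cr:
            break
    return trial

# Example usage on a flattened weight vector of a small ANN.
rng = np.random.default_rng(0)
target = rng.normal(size=8)
mutant = rng.normal(size=8)
print(binomial_crossover(target, mutant, cr=0.9, rng=rng))
print(exponential_crossover(target, mutant, cr=0.9, rng=rng))
```

The contrast matters for the paper's finding: exponential crossover only exchanges contiguous runs of parameters, so it helps when interacting variables sit next to each other in the vector, whereas binomial crossover treats every dimension independently.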
