Abstract

In recent years, multi-objective evolutionary optimization algorithms have shown success across many areas of research, and their efficiency and power have led many researchers to adapt evolutionary algorithms to generate Pareto-optimal solutions. This paper proposes a new memetic adaptive multi-objective evolutionary algorithm based on a three-term backpropagation network (MAMOT). The algorithm automatically searches for the parameters that optimize neural-network performance, relying on an adaptive non-dominated sorting genetic algorithm-II (NSGA-II) integrated with the backpropagation algorithm as a local search method. MAMOT employs a self-adaptive mechanism to improve performance and a local optimizer that refines every individual in the population, yielding better accuracy and connection weights while simultaneously selecting an appropriate number of hidden nodes. The proposed method was applied to 11 datasets representing pattern classification problems, including two-class, multi-class, and complex data reflecting real problems. Experimental results indicate that the proposed method is viable for pattern classification tasks compared with a multi-objective genetic algorithm based on a three-term backpropagation network (MOGAT) and with several methods reported in the literature. Statistical analysis using the t-test and the Wilcoxon signed-ranks test also shows that the proposed method performs significantly better than MOGAT.
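To make the memetic idea concrete, the following is a minimal illustrative sketch (not the authors' code) of a memetic multi-objective evolutionary loop for neural-network training: a population of weight vectors for a tiny one-hidden-layer network is evolved against two objectives (classification error and a proxy for the number of active hidden nodes), each individual is improved by a few gradient-descent steps as local search, and survivors are selected by repeated Pareto-front extraction in the style of NSGA-II (crowding distance and the paper's three-term backpropagation variant are omitted for brevity; all names, the XOR task, and the thresholds are assumptions of this sketch).

```python
# Illustrative sketch (assumptions, not the authors' method): memetic
# multi-objective EA evolving weights of a tiny net on the XOR problem.
# Objective 1: mean squared error.  Objective 2: number of "active" hidden
# nodes (outgoing weight magnitude above a threshold), a complexity proxy.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)
H = 4                      # maximum number of hidden nodes
dim = 4 * H + 1            # flattened parameter vector length

def unpack(w):
    W1 = w[:2 * H].reshape(2, H); b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H];          b2 = w[4 * H]
    return W1, b1, W2, b2

def objectives(w):
    W1, b1, W2, b2 = unpack(w)
    out = np.tanh(X @ W1 + b1) @ W2 + b2
    mse = np.mean((out - y) ** 2)
    active = float(np.sum(np.abs(W2) > 0.1))   # hidden-node count proxy
    return mse, active

def backprop_step(w, lr=0.1, steps=5):
    # Local search: a few plain gradient-descent steps on the MSE
    # (the paper uses a three-term backpropagation variant instead).
    w = w.copy()
    for _ in range(steps):
        W1, b1, W2, b2 = unpack(w)
        h = np.tanh(X @ W1 + b1)
        out = h @ W2 + b2
        d_out = 2 * (out - y) / len(y)
        gW2 = h.T @ d_out; gb2 = d_out.sum()
        d_h = np.outer(d_out, W2) * (1 - h ** 2)
        gW1 = X.T @ d_h;   gb1 = d_h.sum(axis=0)
        w -= lr * np.concatenate([gW1.ravel(), gb1, gW2, [gb2]])
    return w

def dominates(a, b):
    return all(x <= z for x, z in zip(a, b)) and any(x < z for x, z in zip(a, b))

def pareto_front(objs):
    # Indices of non-dominated objective vectors.
    return [i for i, oi in enumerate(objs)
            if not any(dominates(oj, oi) for j, oj in enumerate(objs) if j != i)]

POP = 20
pop = [rng.normal(0, 0.5, dim) for _ in range(POP)]
for gen in range(30):
    pop = [backprop_step(w) for w in pop]                    # memetic local search
    children = [w + rng.normal(0, 0.1, dim) for w in pop]    # Gaussian mutation
    merged = pop + children
    objs = [objectives(w) for w in merged]
    # Environmental selection by repeated front extraction
    # (simplified NSGA-II: no crowding-distance tie-breaking).
    keep, remaining = [], list(range(len(merged)))
    while len(keep) < POP:
        front = pareto_front([objs[i] for i in remaining])
        keep.extend(remaining[i] for i in front[:POP - len(keep)])
        remaining = [r for k, r in enumerate(remaining) if k not in front]
    pop = [merged[i] for i in keep]

best_mse = min(objectives(w)[0] for w in pop)
print(f"best MSE on XOR after 30 generations: {best_mse:.4f}")
```

The trade-off surfaced by the two objectives mirrors the abstract's claim: the Pareto set contains both accurate-but-larger networks and smaller-but-coarser ones, from which a final network and hidden-node count can be chosen.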
