Abstract

With the rise of big data in cloud computing, many optimization problems have gradually developed into high-dimensional, large-scale optimization problems. To address the problem of dimensionality in genetic-algorithm optimization, an adaptive dimensionality reduction genetic optimization algorithm (ADRGA) is proposed. An adaptive vector angle factor is introduced in the algorithm. When the angle formed by two adjacent dimensions of an individual is less than the angle factor, the value of the smaller dimension is marked as 0. The angle between adjacent dimensions is then calculated separately for each individual, and the number of zeros in the population is updated. When the number of zeros across all individuals in a certain dimension exceeds a given constant, that dimension is considered to carry no more information and is deleted. Eight high-dimensional test functions are used to verify the proposed algorithm. The experimental results show that the convergence, accuracy, and speed of the proposed algorithm are better than those of the standard genetic algorithm (GA), the hybrid genetic and simulated annealing algorithm (HGSA), and the adaptive genetic algorithm (AGA).
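To make the reduction rule concrete, the sketch below (Python/NumPy) shows one plausible reading of the two steps described above: zeroing the smaller of two adjacent components whose pair angle falls below the angle factor, and deleting any dimension whose zero count across the population exceeds a critical value. The exact angle definition and the names `angle_factor` and `critical_count` are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def mark_small_dimensions(pop, angle_factor):
    """Zero the smaller of two adjacent components when the angle formed by the
    pair (|x_i|, |x_{i+1}|) with its dominant axis falls below angle_factor.
    This reading of the paper's marking rule is an assumption."""
    marked = pop.copy()
    for ind in marked:                      # each row is one individual (modified in place)
        for i in range(ind.size - 1):
            a, b = abs(ind[i]), abs(ind[i + 1])
            if max(a, b) == 0.0:
                continue                    # both components already zero
            angle = np.arctan2(min(a, b), max(a, b))   # lies in [0, pi/4]
            if angle < angle_factor:
                # the pair is dominated by one axis: mark the smaller component as 0
                if a < b:
                    ind[i] = 0.0
                else:
                    ind[i + 1] = 0.0
    return marked

def drop_uninformative_dimensions(pop, critical_count):
    """Delete every dimension whose zero count over the whole population
    exceeds critical_count (the critical value Q in the paper)."""
    keep = (pop == 0.0).sum(axis=0) <= critical_count
    return pop[:, keep], np.flatnonzero(keep)
```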

Highlights

  • High-dimensional optimization problems have become increasingly prevalent in many fields, for example, radar waveform optimization [1, 2] and water quality monitoring [3]

  • In 2015, Chen et al. [6] designed a congestion control strategy based on the concept of open angles and compared it against the indicator-based evolutionary algorithm (IBEA) [7], NSGA-III [8], and the grid-based evolutionary algorithm (GrEA) [9]. The results indicated a significant improvement by the proposed algorithm

  • Adaptive dimensionality reduction is feasible following the principle mentioned above. The pseudocode for the adaptive dimensionality reduction genetic optimization algorithm (ADRGA) is shown in Algorithm 1 (Pc is the probability of crossover, Pm is the probability of mutation, N is the swarm size, G is the number of generations upon termination of evolution, T is the number of tests, and Q is the critical value); a rough sketch of such a loop is given below

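Since Algorithm 1 itself is not reproduced on this page, the following is a minimal Python skeleton of how such a loop could be organized, assuming standard GA operators (truncation selection, single-point crossover, uniform mutation) and reusing `mark_small_dimensions` from the earlier sketch. The parameters follow the highlight (Pc, Pm, N, G, Q); T, the number of independent test runs, would simply wrap repeated calls and is omitted. None of the operator details should be read as the authors' exact implementation.

```python
import numpy as np

def crossover_and_mutate(parents, Pc, Pm, lo, hi, rng):
    """Single-point crossover and uniform mutation (standard-GA placeholders)."""
    children = parents.copy()
    rng.shuffle(children)                                # shuffle rows before pairing
    for k in range(0, len(children) - 1, 2):
        if rng.random() < Pc and children.shape[1] > 1:
            cut = rng.integers(1, children.shape[1])     # crossover point
            tmp = children[k, cut:].copy()
            children[k, cut:] = children[k + 1, cut:]
            children[k + 1, cut:] = tmp
    mask = rng.random(children.shape) < Pm               # uniform mutation
    children[mask] = rng.uniform(lo, hi, size=int(mask.sum()))
    return children

def adrga(fitness, dim, lo, hi, N=100, G=500, Pc=0.8, Pm=0.05, Q=80, angle_factor=0.01):
    """Skeleton of an ADRGA-style loop: one GA generation followed by the
    adaptive dimensionality reduction step. Minimization is assumed, and
    fitness(ind, active) must know which original dimensions remain."""
    rng = np.random.default_rng()
    pop = rng.uniform(lo, hi, size=(N, dim))
    active = np.arange(dim)                              # indices of surviving dimensions
    for _ in range(G):
        fit = np.array([fitness(ind, active) for ind in pop])
        parents = pop[np.argsort(fit)[: N // 2]]         # truncation selection (placeholder)
        children = crossover_and_mutate(parents, Pc, Pm, lo, hi, rng)
        pop = np.vstack([parents, children])
        # adaptive dimensionality reduction: mark near-zero components, then
        # delete dimensions whose zero count exceeds the critical value Q
        pop = mark_small_dimensions(pop, angle_factor)
        keep = (pop == 0.0).sum(axis=0) <= Q
        pop, active = pop[:, keep], active[keep]
    fit = np.array([fitness(ind, active) for ind in pop])
    return pop[np.argmin(fit)], active
```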

Summary

Introduction

High-dimensional optimization problems have become increasingly prevalent in many fields, for example, radar waveform optimization [1, 2] and water quality monitoring [3]. Zheng et al. [10] proposed a high-dimensional multiobjective evolutionary algorithm based on information separation. Although this algorithm decomposed the high-dimensional space into a low-dimensional one, it did not remove the excess dimensions. Xu et al. [15] proposed a multidimensional learning strategy based on the experience of the best individuals, which was used to discover and integrate valuable information from the best swarm solution. In their experiments, 16 classical benchmark functions, 30 CEC 2014 test functions, and one real-world optimization problem were used. This algorithm, however, improved classification accuracy only to a limited extent compared with standard genetic algorithms. He and Yen [17] used the minimum included angle between two vectors to look for similar individuals, among which the individuals with poor convergence were deleted. This also provides a reference for U-model approaches in algorithm implementation.
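For reference, the included angle mentioned above is the standard angle between two vectors; the small helper below is only an illustration of that measure, not code from [17].

```python
import numpy as np

def included_angle(u, v):
    """Included angle (in radians) between two vectors, used to judge how
    similar two individuals are."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards against rounding error
```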

The Proposed ADRGA Algorithm
Experimental Results