Abstract
Storing extensive data in cloud environments affects service quality, transmission speed, and access to information, and is becoming a growing challenge. Reducing costs of various kinds and finding the shortest path for storage across distributed cloud data centers are among the important issues in cloud computing. In this paper, a particle swarm optimization (PSO) algorithm and a learning automaton (LA) are used to minimize data center costs, including communication, data transfer, and storage, and to optimize communication between data centers. To improve storage in distributed data centers, a new model called LAPSO is proposed by combining LA and PSO, in which the LA improves particle control by guiding the search over particle velocity and position. In this method, the LA moves each particle in the direction of its best individual and group experiences, so that in multipeak problems the search does not fall into local optima. The experiments are reported on the spatial information and cadastre dataset of national lands, which includes 13 data centers. The proposed method improves the optimal position parameters, minimum route cost, distance, data transfer cost, storage cost, data communication cost, load balance, and access performance better than the compared methods.
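The abstract only outlines how the LA steers particle velocity and position toward the best individual and group experiences. The following Python sketch shows one plausible reading of that idea: a per-particle learning automaton chooses whether to bias the velocity update toward the personal best or the global best, and is rewarded when the move improves fitness. All names and parameters (la_choose, la_update, lapso_step, the reward rate a, the bias values) are illustrative assumptions, not the paper's actual formulation.

import numpy as np

rng = np.random.default_rng(0)

def la_choose(prob):
    # Select an action index according to the LA's probability vector.
    return rng.choice(len(prob), p=prob)

def la_update(prob, action, improved, a=0.1):
    # Linear reward-inaction scheme: reinforce the chosen action only on improvement.
    if improved:
        prob = prob * (1 - a)
        prob[action] += a
    return prob / prob.sum()

def lapso_step(x, v, pbest, gbest, prob, fitness, w=0.7, c1=1.5, c2=1.5):
    # Action 0 biases the move toward the personal best, action 1 toward the global best.
    action = la_choose(prob)
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    bias = (1.5, 0.5) if action == 0 else (0.5, 1.5)
    v = w * v + bias[0] * c1 * r1 * (pbest - x) + bias[1] * c2 * r2 * (gbest - x)
    x_new = x + v
    improved = fitness(x_new) < fitness(x)
    prob = la_update(prob, action, improved)
    return x_new, v, prob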
Highlights
With the increasing growth of cloud computing and the availability of data storage centers and computing servers in the cloud, many applications nowadays require infrastructure to store and process data outside the client and its systems.
Optimization of cloud storage in terms of distance. The main task of this section is to analyze and discuss the problem of cloud storage among 13 data centers according to the distance factor. The total distance between adjacent data centers plays an important role in the transmission and efficiency of data centers. The proposed method (LAPSO) is compared with the particle swarm optimization (PSO) and genetic algorithms [14]. The parameters of the LAPSO algorithm and the cloud storage optimization problem are initialized at the beginning of the evolutionary process. The number of particles in the LAPSO algorithm is set to 78, and the maximum number of generations is set to 100.
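The sketch below mirrors the experiment scale stated in this section (13 data centers, a swarm of 78 particles, 100 generations) and evaluates a candidate placement by its total inter-center distance. The random symmetric distance matrix and the route_distance objective are placeholders standing in for the real communication network, not the paper's data.

import numpy as np

N_DATACENTERS = 13
SWARM_SIZE = 78
MAX_GENERATIONS = 100

rng = np.random.default_rng(1)
upper = np.triu(rng.uniform(10.0, 500.0, size=(N_DATACENTERS, N_DATACENTERS)), 1)
dist = upper + upper.T  # symmetric distances between adjacent data centers (placeholder)

def route_distance(assignment):
    # Total distance of a placement: source i stores its replica at data center assignment[i].
    return sum(dist[i, int(j)] for i, j in enumerate(assignment))

# Example: evaluate one random candidate placement.
candidate = rng.integers(0, N_DATACENTERS, size=N_DATACENTERS)
print(route_distance(candidate))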
Spatial information of Iranian lands was introduced, according to the existing communication network and the number of available resources, to solve the storage problem between data centers with LAPSO. The method was used to optimize cloud storage by minimizing storage costs.
Summary
With the increasing growth of cloud computing and the ability to use data storage centers and computing servers in the cloud, many applications nowadays require infrastructure to store and process data outside the client and its systems. In addition to optimizing the search model, the relationships between the parameters are considered. Research in this field has generally addressed improving cloud storage in distributed data centers in order to minimize costs and find the shortest route, since each resource has its own data center and stores its data in another data center to ensure safety and security. The multiobjective PSO algorithm finds the best particles by minimizing the optimal position parameters, minimum route cost, distance, data transfer cost, storage cost, data communication cost, load balance, and access performance. To improve storage, the new LAPSO model is presented to improve the performance of the multiobjective PSO algorithm; its objective function minimizes the cost of creating a data center as well as the costs of storing, transferring, and communicating between data centers. Although this is a step in the right direction, the complicated legal environment of numerous regulating bodies and legislations still points to per-jurisdiction data centers as an attractive alternative.
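As a minimal sketch of the objective described in the summary, the cost terms can be combined into a single value for a candidate storage assignment. The weighted-sum form, the weights, and the individual cost functions below are illustrative assumptions rather than the paper's exact multiobjective formulation.

def total_cost(assignment, storage_cost, transfer_cost, comm_cost,
               w_storage=1.0, w_transfer=1.0, w_comm=1.0):
    # Combined cost that a PSO-style optimizer would minimize for one candidate assignment.
    return (w_storage * storage_cost(assignment)
            + w_transfer * transfer_cost(assignment)
            + w_comm * comm_cost(assignment))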