Abstract

With the advent of Industry 4.0 in 2011, new concepts and technologies entered the IT literature, among them virtualization, modularity, big data, and deduplication. Big data can be defined as data of a magnitude that exceeds the ability of traditional database systems to collect, store, manage, and analyze it. Because today's data is diverse, large, and rapidly changing, traditional database structures can no longer cope, and the algorithms previously used for data processing have become inadequate. New algorithms and technologies have therefore been developed, and one of the most important of these is data deduplication. Deduplication divides data into variable- or fixed-size chunks and stores only one copy of each repeated chunk, thereby saving storage space. Today, "deduplication and compression" is an indispensable feature of data storage in both server-storage and hyper-converged architectures. Meanwhile, artificial intelligence technologies are advancing rapidly and their application areas are expanding, making artificial intelligence a technology of great importance for industry and for daily life. The purpose of this paper is to give an idea of the relationship between deduplication technology and artificial intelligence by examining various deduplication systems and algorithms. Studies in the literature show that deduplication yields significant savings in storage space, underline the importance of data security, and point to the combined use of artificial intelligence and deduplication.
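
To make the fixed-size variant described above concrete, the following Python sketch (a hypothetical illustration, not an implementation from the paper) splits data into fixed-size chunks, keys each unique chunk by its SHA-256 digest, and records duplicates only as references to the stored copy:

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks; keep one stored copy per unique chunk."""
    store = {}   # SHA-256 digest -> chunk bytes (one copy per unique chunk)
    recipe = []  # ordered list of digests needed to rebuild the original data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk  # first occurrence: store the chunk itself
        recipe.append(digest)      # duplicates add only a reference
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original data from the chunk store and the recipe."""
    return b"".join(store[d] for d in recipe)

# Usage: 12 logical chunks, but only 2 unique chunks are actually stored.
data = (b"A" * 4096) * 8 + (b"B" * 4096) * 4
store, recipe = deduplicate(data)
assert reconstruct(store, recipe) == data
print(f"logical: {len(data)} bytes, "
      f"stored: {sum(len(c) for c in store.values())} bytes")
```

Variable-size (content-defined) chunking works on the same principle but places chunk boundaries based on the data's content, which keeps duplicate detection effective even when bytes are inserted or deleted and the fixed-offset alignment would otherwise shift.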
