Abstract

By outsourcing data to cloud storage servers with deduplication, data owners can reduce not only their own storage cost but also the cloud's. This paradigm, however, introduces new security issues, such as the threat of data being lost or corrupted. Data integrity verification is used to safeguard the integrity of such data. However, cloud deduplication storage typically works at the file or chunk level, keeping one copy of identical data hosted by different owners, and does not address identical parts of different data objects, e.g., a series of version files. We propose an integrity verification algorithm for different version files. The algorithm establishes a generic storage model for different version control methods to improve the universality of data verification. The generation of verification tags and proofs is then improved based on the index pointers corresponding to the storage relationships within version groups and on chained keys. Finally, random diffusion extraction based on random data sampling within a version group is proposed to improve verification efficiency. Theoretical and experimental analyses indicate that the algorithm achieves fast, large-scale verification of different version data.
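To make the storage model concrete, the following is a minimal Python sketch of a hypothetical version-group structure of the kind the abstract describes: a base version stores all of its chunks, later versions store only their new chunks plus index pointers to chunks they share with earlier versions, and per-chunk verification tags are computed with chained keys derived from a group key. The names (VersionGroup, VersionEntry, group_key, and so on) and the HMAC-based tag are illustrative assumptions, not the paper's actual construction.

import hashlib
import hmac
from dataclasses import dataclass, field

# Hypothetical sketch (not the paper's construction): a version group where a
# base version stores all chunks and each later version stores only its new
# chunks plus index pointers into earlier versions, so chunks shared across
# versions are deduplicated within the group.

@dataclass
class VersionEntry:
    new_chunks: dict                            # local index -> chunk bytes stored by this version
    pointers: dict                              # local index -> (version_id, index) of a reused chunk
    tags: dict = field(default_factory=dict)    # local index -> verification tag

class VersionGroup:
    def __init__(self, group_key: bytes):
        self.group_key = group_key
        self.versions = []                      # version_id -> VersionEntry

    def chained_key(self, version_id: int) -> bytes:
        # Chained keys: each version's tag key is derived by hashing the
        # previous key, so a holder of the group key can recompute any of them.
        key = self.group_key
        for _ in range(version_id + 1):
            key = hashlib.sha256(key).digest()
        return key

    def add_version(self, chunks: list, reuse_from: dict) -> int:
        """chunks: full chunk list of the new version;
        reuse_from: local index -> (version_id, index) of an identical stored chunk."""
        vid = len(self.versions)
        key = self.chained_key(vid)
        entry = VersionEntry(new_chunks={}, pointers=dict(reuse_from))
        for i, chunk in enumerate(chunks):
            if i in reuse_from:
                continue                        # deduplicated: keep only the index pointer
            entry.new_chunks[i] = chunk
            # The tag binds the chunk to its version and position via the chained key.
            entry.tags[i] = hmac.new(key, f"{vid}|{i}".encode() + chunk,
                                     hashlib.sha256).digest()
        self.versions.append(entry)
        return vid

# Example: version 1 keeps chunk 0 of version 0 and adds a new chunk of its own.
group = VersionGroup(group_key=b"demo-group-key")
group.add_version([b"chunk-A", b"chunk-B"], reuse_from={})
group.add_version([b"chunk-A", b"chunk-C"], reuse_from={0: (0, 0)})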

Highlights

  • With the rapid development of cloud computing, cloud storage, as a new generation of computing infrastructure, has received increasing attention

  • To improve the efficiency and universality of data verification on cloud deduplication storage, we propose a verification algorithm for different version files (VDVF) that can verify the integrity of version data in remote storage while protecting users' privacy

  • Considering that the data service provider (DSP) may not acknowledge that the verified data are corrupted unless the user declares it, we introduce a third-party verifier (TPV) to verify the integrity of these data


Summary

Introduction

With the rapid development of cloud computing, cloud storage, as a new generation of computing infrastructure, has received increasing attention. To improve the efficiency and universality of data verification on cloud deduplication storage, we propose a verification algorithm for different version files (VDVF) that can verify the integrity of version data in remote storage while protecting users' privacy. Liu et al. [13] proposed a message-locked integrity auditing scheme based on proxy re-signature techniques for encrypted cloud deduplication storage without an additional proxy server. Their algorithm applies only to deduplicated storage that keeps one copy of identical data hosted by different owners. We improve the verification of different version files to reduce the impact of these problems on the integrity of version files
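As an illustration of how a third-party check over such a version group could proceed, the sketch below builds on the VersionGroup sketch given after the abstract: the TPV samples random chunk positions of one version, the sampled positions are followed along the index pointers to the versions that actually store the chunks (a simplified stand-in for the random diffusion extraction mentioned in the abstract), the DSP returns the chunks and tags, and the TPV recomputes the tags from the group key and chained keys. The function names, and the simplification that the verifier sees raw chunks, are assumptions for illustration; the paper's actual VDVF proofs are more involved.

import hashlib
import hmac
import random

# Simplified, hypothetical challenge-response flow over the VersionGroup
# sketch above; an illustration of the idea, not the actual VDVF protocol.

def resolve(group, vid, idx):
    """Follow index pointers to the (version_id, index) that physically stores the chunk."""
    entry = group.versions[vid]
    while idx in entry.pointers:
        vid, idx = entry.pointers[idx]
        entry = group.versions[vid]
    return vid, idx

def tpv_challenge(group, vid, sample_size, seed):
    """TPV picks random chunk positions of version vid to challenge."""
    n = len(group.versions[vid].new_chunks) + len(group.versions[vid].pointers)
    return random.Random(seed).sample(range(n), min(sample_size, n))

def dsp_prove(group, vid, challenge):
    """DSP resolves each sampled position and returns the stored chunk and its tag."""
    proof = []
    for idx in challenge:
        svid, sidx = resolve(group, vid, idx)
        entry = group.versions[svid]
        proof.append((svid, sidx, entry.new_chunks[sidx], entry.tags[sidx]))
    return proof

def tpv_verify(group_key, proof):
    """TPV recomputes each tag from the group key and the chained keys."""
    for svid, sidx, chunk, tag in proof:
        key = group_key
        for _ in range(svid + 1):               # re-derive the chained key for version svid
            key = hashlib.sha256(key).digest()
        expected = hmac.new(key, f"{svid}|{sidx}".encode() + chunk, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tag):
            return False
    return True

# Example: challenge version 1 of the group built earlier and verify the proof.
# challenge = tpv_challenge(group, 1, sample_size=2, seed=42)
# assert tpv_verify(b"demo-group-key", dsp_prove(group, 1, challenge))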

System model and problem statements
Constructing the generic storage model of version files
Algorithm design
Security of data storage
Data privacy-protecting