Abstract

The use of cloud computing is growing rapidly, and the cloud plays an important role in data-sharing applications. Because new data is uploaded to the cloud every day, the amount of similar data stored there also keeps increasing. The size of this redundant data can be reduced using data deduplication, whose main aim is to remove duplicate data from the cloud; this also helps to save storage space and bandwidth. The proposed method removes duplicate data while assigning each user a set of privileges, against which the duplicate check is performed, and each user holds a unique token. Cloud deduplication is achieved using a hybrid cloud architecture. The proposed method is more secure and consumes fewer cloud resources, and it is shown to incur minimal overhead in duplicate removal compared with the standard deduplication technique. In this paper, both content-level and file-level deduplication of file data are checked over the cloud.
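The file-level check described above can be illustrated with a minimal sketch: hash the whole file, and store the content only if that hash has not been seen before. This is an assumption-laden toy (an in-memory store with SHA-256 hashing); the paper's actual scheme additionally involves per-user privilege tokens and a hybrid cloud architecture, which are omitted here.

```python
import hashlib

class DedupStore:
    """Toy in-memory store illustrating file-level deduplication.

    Illustrative only: the privilege tokens and hybrid-cloud
    duplicate check from the proposed scheme are not modeled.
    """

    def __init__(self):
        self.blobs = {}   # content hash -> file bytes
        self.saved = 0    # bytes NOT re-stored thanks to dedup

    def upload(self, data: bytes) -> str:
        # File-level check: hash the entire file content.
        digest = hashlib.sha256(data).hexdigest()
        if digest in self.blobs:
            # Duplicate detected: store nothing, count the savings.
            self.saved += len(data)
        else:
            self.blobs[digest] = data
        return digest

store = DedupStore()
t1 = store.upload(b"shared-report")
t2 = store.upload(b"shared-report")   # second upload of the same file
```

A content-level (block-level) variant would split each file into chunks and apply the same hash check per chunk, catching duplication even between files that are only partially identical.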
