Abstract

Innovations are necessary to ride the inevitable tide of change. Most enterprises are striving to reduce their computing costs through virtualization, and this demand has led to the emergence of cloud computing. One fundamental aspect of this new computing model is that data is centralized or outsourced into the cloud. From the perspective of data owners, including both individuals and IT enterprises, storing data remotely in the cloud in a flexible, on-demand manner brings appealing benefits: relief from the burden of storage management, universal data access independent of geographical location, and avoidance of capital expenditure on hardware, software, and personnel maintenance. However, although the infrastructures underlying the cloud are far more powerful and reliable than personal computing devices, they still face a broad range of both internal and external threats to data integrity. While outsourcing data into the cloud is economically attractive given the cost and complexity of long-term, large-scale data storage, it does not by itself offer any guarantee of data integrity and availability. We propose a distributed scheme to assure users that their data are indeed stored correctly and kept intact at all times in the cloud. We use an erasure-correcting code during file distribution preparation to provide redundancy. We rely on a challenge-response protocol, together with pre-computed tokens, to verify the storage correctness of users' data and to effectively locate the malfunctioning server when data corruption is detected. Our scheme maintains the same level of storage-correctness assurance even if users modify, delete, or append their data files in the cloud.

Keywords - Cloud computing, Distributed data storage, Data security, Pervasive computing, Virtualization.
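
To make the token-based verification idea concrete, the following is a minimal, simplified sketch in Python, not the paper's exact construction: the data owner pre-computes keyed tokens over randomly chosen blocks of an erasure-coded share before outsourcing it, later challenges the server for those blocks, and flags the server as corrupted if the recomputed token does not match. Names such as TOKEN_KEY, BLOCK_SIZE, and CHALLENGE_SIZE are illustrative assumptions.

```python
import hashlib
import hmac
import os
import random

TOKEN_KEY = os.urandom(32)   # secret kept by the data owner (assumed)
BLOCK_SIZE = 4096            # assumed block size of one coded share
CHALLENGE_SIZE = 4           # number of blocks sampled per challenge (assumed)


def split_blocks(share: bytes, size: int = BLOCK_SIZE) -> list[bytes]:
    """Split one (already erasure-coded) server share into fixed-size blocks."""
    return [share[i:i + size] for i in range(0, len(share), size)]


def precompute_token(blocks: list[bytes], indices: list[int], key: bytes) -> bytes:
    """Owner side: keyed MAC over the selected block positions, stored locally."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    for i in indices:
        mac.update(i.to_bytes(4, "big"))
        mac.update(blocks[i])
    return mac.digest()


def respond_to_challenge(stored_blocks: list[bytes], indices: list[int]) -> list[bytes]:
    """Server side (simplified): return the challenged blocks verbatim."""
    return [stored_blocks[i] for i in indices]


def verify(response: list[bytes], indices: list[int], token: bytes, key: bytes) -> bool:
    """Owner side: recompute the MAC over the response and compare with the token."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    for i, block in zip(indices, response):
        mac.update(i.to_bytes(4, "big"))
        mac.update(block)
    return hmac.compare_digest(mac.digest(), token)


# Usage: pre-compute a token before outsourcing, then challenge the server later.
share = os.urandom(BLOCK_SIZE * 16)            # stand-in for one coded share
blocks = split_blocks(share)
indices = random.Random(42).sample(range(len(blocks)), CHALLENGE_SIZE)
token = precompute_token(blocks, indices, TOKEN_KEY)

response = respond_to_challenge(blocks, indices)
print("server intact:", verify(response, indices, token, TOKEN_KEY))
```

A mismatch on any server's response pinpoints that server as the corrupted one, which is the localization property the scheme relies on; the redundancy from the erasure code then allows the affected blocks to be recovered from the remaining servers.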
