Abstract

As big data becomes a main driving force for the next generation of the IT industry, big data privacy, an unavoidable topic in the big data era, has received considerable attention in recent years. To address these privacy challenges, differential privacy has been widely discussed as one of the most popular privacy-enhancing techniques. However, with today's differential privacy techniques, it is impossible to generate a single sanitized dataset that suits different algorithms or applications regardless of the privacy budget. In other words, in order to adapt to various applications and privacy budgets, different kinds of noise have to be added, which inevitably incurs enormous communication and storage costs. To address these challenges, in this paper we propose a novel scheme for outsourcing differential privacy in cloud computing, in which an additively homomorphic encryption scheme (e.g., Paillier encryption) enables cloud servers to compute the noise required for differential privacy, boosting efficiency. The proposed scheme allows data providers to outsource their dataset sanitization procedure to cloud service providers at a low communication cost. In addition, data providers can go offline after uploading their datasets and noise parameters, which is one of the critical requirements for a practical system. We present a detailed theoretical analysis of the proposed scheme, including proofs of differential privacy and security. Moreover, we report an experimental evaluation on real UCI datasets, which confirms the effectiveness of the proposed scheme.
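To illustrate the additive homomorphic property the scheme builds on, the following is a minimal sketch, not the paper's actual protocol: it assumes the open-source python-paillier (phe) library and NumPy, and uses illustrative values for epsilon and sensitivity. It shows a cloud server adding encrypted Laplace noise to an encrypted value it cannot read, so that only the key holder sees the differentially private result.

# pip install phe numpy
import numpy as np
from phe import paillier

# --- Data provider (can go offline after this step) ---
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)
true_count = 42                          # a private query result (illustrative)
enc_count = public_key.encrypt(true_count)
# The provider ships enc_count plus the noise parameters (epsilon, sensitivity).

# --- Cloud server: computes the noise without ever seeing the plaintext ---
epsilon, sensitivity = 1.0, 1.0          # illustrative privacy budget and sensitivity
noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
enc_noise = public_key.encrypt(noise)
enc_noisy_count = enc_count + enc_noise  # additive homomorphism: Enc(a) + Enc(b) decrypts to a + b

# --- Key holder: decrypts the differentially private result ---
print(private_key.decrypt(enc_noisy_count))

Because the noise is added in the ciphertext domain, the sanitization step itself can be outsourced: the server only ever handles encrypted data and encrypted noise.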
