Abstract

As big data becomes a main impetus for the next generation of the IT industry, data privacy has received considerable attention in recent years. To address these privacy challenges, differential privacy has been widely studied, and various private mechanisms have been proposed as privacy-enhancing techniques. However, with today’s differential privacy techniques, it is difficult to generate a single sanitized dataset that suits every machine learning task. To adapt to various tasks and privacy budgets, different kinds of privacy mechanisms must be implemented, which inevitably incurs enormous computation and interaction costs. To this end, in this article, we propose two novel schemes for outsourcing differential privacy. The first scheme efficiently achieves outsourced differential privacy by using our preprocessing method and secure building blocks. To support queries from multiple evaluators, the second scheme employs a trusted execution environment to implement privacy mechanisms on multiple queries in aggregate. During data publishing, our schemes allow providers to go offline after uploading their datasets, achieving the low communication cost that is a critical requirement for a practical system. Finally, we report an experimental evaluation on UCI datasets that confirms the effectiveness of our schemes.
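The abstract does not specify which privacy mechanisms the schemes implement, but a standard building block for differential privacy is the Laplace mechanism, which adds calibrated noise to a query result. As background, here is a minimal sketch of that mechanism; the function names, parameters, and the count-query example are illustrative assumptions, not the paper's construction.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample zero-mean Laplace noise via the inverse-CDF method.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    # Release true_value with epsilon-differential privacy for a query
    # whose L1 sensitivity is `sensitivity`: noise scale = sensitivity / epsilon.
    return true_value + laplace_noise(sensitivity / epsilon)

# Illustrative usage: privatize a count query (sensitivity 1) under epsilon = 0.5.
noisy_count = laplace_mechanism(42.0, sensitivity=1.0, epsilon=0.5)
```

A smaller privacy budget epsilon yields larger noise, which is why (as the abstract notes) a single sanitized dataset rarely suits every task and budget at once.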
