Abstract

Recently, privacy-preserving machine learning (PPML) has received considerable research attention, driven by the increasing demand for training machine learning models on private data held by multiple owners. Many PPML schemes have been proposed and notable breakthroughs have been made, but there is still much room for improvement in efficiency and accuracy. In this paper, we design a toolkit called LEGO that efficiently combines garbled circuits, secret sharing, oblivious transfer, and homomorphic encryption to reduce training time and improve accuracy. Our work focuses on the two-server model, in which two servers train models on the private data of multiple owners. For fully connected and convolutional layers, we utilize the multiplication triplets associated with these layers to reduce communication. For the offline phase, we design a novel protocol based on oblivious transfer and improve the previous protocol based on homomorphic encryption. We use an MPC-friendly activation function to improve performance. In addition, we accelerate local computation with GPU support. LEGO is implemented in C++ and can securely train machine learning models including linear regression, logistic regression, neural networks, and convolutional neural networks. Experimental results show that our work is significantly faster than state-of-the-art works.
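As background on one of the techniques named above, here is a minimal single-process sketch of multiplication triplets (Beaver triplets) over additive secret shares in the ring Z_{2^64}. This illustrates the general technique rather than LEGO's actual protocol; the names (Party, share, mul_share), the ring choice, and the trusted-dealer simulation of the offline phase are our assumptions for the sketch.

```cpp
#include <cstdint>
#include <iostream>
#include <random>

// Additive secret sharing over Z_{2^64}: a value x is split into
// x0 + x1 (mod 2^64), with one share held by each of the two servers.
// Unsigned wraparound gives the ring arithmetic for free.
using u64 = uint64_t;

struct Shares { u64 s0, s1; };

// Split a value into two random additive shares.
Shares share(u64 x, std::mt19937_64& rng) {
    u64 r = rng();
    return { r, x - r };  // wraps mod 2^64
}

// One party's state for a Beaver multiplication: its shares of the
// inputs x, y and of a preprocessed triplet (a, b, c) with c = a*b.
struct Party {
    u64 x, y, a, b, c;
    // Masked differences that are opened in the online phase.
    u64 e() const { return x - a; }
    u64 f() const { return y - b; }
    // Local share of z = x*y once e and f have been reconstructed.
    u64 mul_share(u64 e_open, u64 f_open, bool is_p0) const {
        u64 z = c + e_open * b + f_open * a;
        if (is_p0) z += e_open * f_open;  // the public term is added once
        return z;
    }
};

int main() {
    std::mt19937_64 rng(42);
    u64 x = 123456789, y = 987654321;

    // Offline phase (simulated here by a trusted dealer): generate and
    // share a random triplet a, b, c = a*b.
    u64 a = rng(), b = rng();
    Shares as = share(a, rng), bs = share(b, rng), cs = share(a * b, rng);
    Shares xs = share(x, rng), ys = share(y, rng);

    Party p0{xs.s0, ys.s0, as.s0, bs.s0, cs.s0};
    Party p1{xs.s1, ys.s1, as.s1, bs.s1, cs.s1};

    // Online phase: the parties open e = x - a and f = y - b. These leak
    // nothing about x and y because a and b are uniformly random masks.
    u64 e = p0.e() + p1.e();
    u64 f = p0.f() + p1.f();

    // Correctness: c + e*b + f*a + e*f = x*y (mod 2^64).
    u64 z = p0.mul_share(e, f, true) + p1.mul_share(e, f, false);
    std::cout << (z == x * y ? "ok" : "mismatch") << "\n";
}
```

The point of the triplet is that the online phase costs only two openings (e and f) plus local arithmetic; generating the shared triplet (a, b, c) ahead of time, via oblivious transfer or homomorphic encryption, is exactly what the offline-phase protocols mentioned in the abstract optimize.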
