Abstract

A privacy-preserving framework in which a computational resource provider receives encrypted data from a client and returns prediction results without decrypting the data, i.e., oblivious neural networks or encrypted prediction, has been studied for machine learning prediction services. In this work, we present MOBIUS (Model-Oblivious BInary neUral networkS), a new system that combines Binarized Neural Networks (BNNs) and secure computation based on secret sharing as tools for scalable and fast privacy-preserving machine learning. BNNs improve computational performance by binarizing values in training to $-1$ and $+1$, while secure computation based on secret sharing provides fast and versatile computation on encrypted values via modulo operations with a short bit length. However, combining these tools is not trivial because their operations have different algebraic structures, and the use of BNNs generally degrades prediction accuracy. MOBIUS uses improved procedures for BNNs and secure computation whose algebraic structures are compatible, without degrading prediction accuracy. We implemented MOBIUS in C++ using the ABY library (NDSS 2015). Experiments on the MNIST dataset show that MOBIUS can return a prediction within 0.76 seconds, six times faster than SecureML (IEEE S&P 2017). MOBIUS allows a client to request encrypted prediction and allows a trainer to obliviously publish an encrypted model to a cloud run by a computational resource provider, i.e., without revealing the original model itself to the provider.
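To make the "secret sharing via modulo operations with a short bit length" concrete, here is a minimal sketch of 2-out-of-2 additive secret sharing over a small ring, the general kind of arithmetic sharing such protocols build on. The names, the modulus choice, and the two-party setting are illustrative assumptions, not the paper's concrete protocol.

```python
# Illustrative sketch: 2-out-of-2 additive secret sharing over Z_{2^k}.
# NOT the paper's actual protocol; constants and names are our own.
import random

K = 16          # short share bit length, as the abstract describes
MOD = 1 << K    # the ring Z_{2^16}

def share(x):
    """Split x into two random-looking shares that sum to x mod 2^k."""
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def reconstruct(s0, s1):
    """Recombine the two shares to recover the secret."""
    return (s0 + s1) % MOD

def add_shares(a, b):
    """Addition of shared values is local: each party adds its own shares."""
    return (a[0] + b[0]) % MOD, (a[1] + b[1]) % MOD

x, y = 12, 30
sz = add_shares(share(x), share(y))
assert reconstruct(*sz) == (x + y) % MOD   # recovers 42
```

Each share alone is uniformly random, so neither party learns anything about the inputs; only combining both shares reveals the result.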

Highlights

  • CONTRIBUTION In this work, we propose a new system named Model-Oblivious BInarized neUral networkS (MOBIUS), which enables scalable encrypted prediction and the use of an encrypted model, i.e., making prediction oblivious to the model

  • BINARIZED NEURAL NETWORKS COMPATIBLE WITH SECURE COMPUTATION We describe our improved Binarized Neural Networks (BNNs), which are suited to secure computation protocols and are used in MOBIUS

  • As our main technical contribution, we present new BNN algorithms that are compatible with secure computation, representing all parameters as integers and removing the bit-shift method used in the original BNNs [15]

Summary

INTRODUCTION

MOBIUS outperforms TAPAS [14] and FHE-DiNN [12], prior works on privacy-preserving prediction based on BNNs, in both accuracy and computational performance, despite also solving the model-oblivious problem. OUR APPROACH (IMPROVED BNNs): Our idea is to modify the original BNN of [15] so that it inherits the fast computation of BNNs (due to binarized forms) while improving accuracy. We execute this idea by using arithmetic shares for both layers and observing that we are no longer confined to multiplying only by powers of 2, as in the shift-based method for SBN. Note that the frameworks in [22], [23] are intended for general protocols such as division, while ours serves the sole purpose of batch normalization. Another approach to preserving privacy is differential privacy [24], which can prevent a trained model from leaking an individual record by perturbing the records with randomized noise.
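The point about not being confined to powers of 2 can be sketched as follows. Since only the sign of a batch-normalized value feeds the binary activation, the floating-point normalization $\gamma (x - \mu)/\sigma + \beta$ can be folded into an integer affine test with an arbitrary integer scale. This is a hedged illustration of the idea; the function name, the scale factor, and the rounding are our own assumptions, not the paper's exact construction.

```python
# Illustrative sketch: folding batch normalization into integer
# coefficients before the sign activation of a BNN.
# The scale need not be a power of 2, unlike the shift-based method.
# Names and constants are hypothetical, not the paper's construction.

def int_bn_sign(x, gamma, mu, sigma, beta, scale=1000):
    """Binarize sign(gamma*(x-mu)/sigma + beta) using only integers."""
    a = round(scale * gamma / sigma)                 # integer slope
    b = round(scale * (beta - gamma * mu / sigma))   # integer offset
    return 1 if a * x + b >= 0 else -1

# Agrees with the floating-point sign for these illustrative parameters:
assert int_bn_sign(5, gamma=0.8, mu=3.0, sigma=2.0, beta=-0.1) == 1
assert int_bn_sign(0, gamma=0.8, mu=3.0, sigma=2.0, beta=-0.1) == -1
```

Because the test $ax + b \ge 0$ uses only integer multiplication and addition, it maps directly onto arithmetic secret sharing, which is what makes the improved BNN compatible with secure computation.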

PRELIMINARIES
BINARIZED NEURAL NETWORKS COMPATIBLE WITH SECURE COMPUTATION
BATCH NORMALIZATION WITH INTEGERS
IMPROVED BINARIZED NEURAL NETWORKS
MOBIUS DESIGN
SECRET SHARING A MODEL
DETERMINATION OF MODULUS SIZE
EXPERIMENT
CONCLUSION