Abstract

Advances in machine learning have revealed great potential for emerging mobile applications such as face recognition and voice assistants. Models trained via a Neural Network (NN) can offer accurate and efficient inference services to mobile users. Unfortunately, current deployments of such services raise privacy concerns: directly offloading the model to the mobile device violates the model privacy of the model owner, while feeding user input to the service compromises user privacy. To address this issue, we propose Leia, a lightweight cryptographic NN inference system at the edge. Leia is designed from two mobile-friendly perspectives. First, it leverages the paradigm of edge computing, wherein the inference procedure keeps the model closer to the mobile user to foster low-latency service. Specifically, Leia's architecture consists of two non-colluding edge servers that obliviously perform NN inference on the encoded user data and model. Second, Leia's realization makes judicious use of the potentially constrained computational and communication resources of edge devices. We adopt the Binarized Neural Network (BNN), a trending flavor of NN with low inference overhead, and rely purely on lightweight secret sharing techniques to realize the secure building blocks of BNN. We implement Leia and deploy it on Raspberry Pi. Empirical evaluations on benchmark and medical datasets with various models demonstrate the practicality of Leia.
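The abstract does not spell out Leia's protocols, but the two-server design rests on 2-out-of-2 additive secret sharing: each server holds one share of the user input (and of the model), and neither share alone reveals anything. A minimal sketch of this primitive, with an illustrative 32-bit ring size that need not match Leia's actual parameters:

```python
import secrets

MOD = 2**32  # illustrative ring size; Leia's actual parameters may differ


def share(x: int) -> tuple[int, int]:
    """Split a secret into two additive shares mod 2^32.

    Each share on its own is uniformly random, so a single
    (non-colluding) server learns nothing about x.
    """
    r = secrets.randbelow(MOD)
    return r, (x - r) % MOD


def reconstruct(s0: int, s1: int) -> int:
    """Recombine the two shares to recover the secret."""
    return (s0 + s1) % MOD


# The scheme is linear: servers can add shares locally, so a
# shared sum is obtained without any interaction.
a0, a1 = share(10)
b0, b1 = share(20)
assert reconstruct((a0 + b0) % MOD, (a1 + b1) % MOD) == 30
```

Linear operations (additions, scalings) thus come almost for free on shares; the non-linear BNN layers (e.g., binary activation) are where dedicated two-party protocols are needed, which is what the paper's secure blocks address.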
