Abstract

Deploying convolutional neural network (CNN) inference on resource-constrained devices remains a significant challenge in the industrial Internet of Things (IIoT). Although cloud computing shows great promise for machine learning training and prediction, outsourcing data to a remote cloud incurs privacy risks and high latency. We therefore design a new framework for efficient and privacy-preserving CNN inference based on cloud-edge-client collaboration (named <formula><tex>$\text{PCNN}_{\text{CEC}}$</tex></formula>). In <formula><tex>$\text{PCNN}_{\text{CEC}}$</tex></formula>, the cloud's model and the client's IIoT data are each split into two shares and sent to two non-colluding edge servers. By applying arithmetic secret sharing and pre-computed Beaver's triplets, the two edge servers can jointly compute the prediction results without learning anything about the model or the data. To speed up the pre-computation of the offline phase without sacrificing security, the task of triplet generation is delegated to the cloud, so the edge servers neither require frequent interactions to generate triplets themselves nor need an additional trusted party. Experimental results show that the proposed private comparison protocol achieves a better tradeoff between low latency and high throughput than garbled-circuit-based protocols and other secret-sharing-based protocols. Additionally, benchmarks conducted on the realistic MNIST and CIFAR-10 datasets demonstrate that <formula><tex>$\text{PCNN}_{\text{CEC}}$</tex></formula> requires less communication and runtime than two recent related schemes at the same security level.
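The online computation described above can be illustrated with a minimal two-party sketch of additive secret sharing and a Beaver-triplet multiplication. The function names (share, reconstruct, beaver_mul), the 64-bit ring, and the way shares are packaged are illustrative assumptions for exposition, not the paper's actual implementation.

```python
# Illustrative sketch (assumed names and ring size), not the PCNN_CEC code:
# two-party additive secret sharing with a pre-computed Beaver triplet.
import secrets

RING = 2 ** 64  # arithmetic shares live in Z_{2^64} (assumed modulus)

def share(x):
    """Split x into two additive shares with x0 + x1 = x (mod RING)."""
    x0 = secrets.randbelow(RING)
    x1 = (x - x0) % RING
    return x0, x1

def reconstruct(x0, x1):
    """Recombine two additive shares."""
    return (x0 + x1) % RING

def beaver_mul(x_shares, y_shares, triple_shares):
    """Multiply secret-shared x and y using a Beaver triplet (a, b, c),
    c = a*b, generated offline (in PCNN_CEC, by the cloud)."""
    (x0, x1), (y0, y1) = x_shares, y_shares
    (a0, b0, c0), (a1, b1, c1) = triple_shares
    # The masked values e = x - a and f = y - b are opened; since a and b
    # are uniformly random, e and f reveal nothing about x and y.
    e = reconstruct((x0 - a0) % RING, (x1 - a1) % RING)
    f = reconstruct((y0 - b0) % RING, (y1 - b1) % RING)
    # Each server computes its share of x*y locally; only one adds e*f.
    z0 = (c0 + e * b0 + f * a0 + e * f) % RING
    z1 = (c1 + e * b1 + f * a1) % RING
    return z0, z1

# Offline phase: a Beaver triplet is sampled and secret-shared,
# then server i receives (a_i, b_i, c_i).
a, b = secrets.randbelow(RING), secrets.randbelow(RING)
a_sh, b_sh, c_sh = share(a), share(b), share((a * b) % RING)
t0 = (a_sh[0], b_sh[0], c_sh[0])
t1 = (a_sh[1], b_sh[1], c_sh[1])

# Online phase: the two edge servers multiply shared inputs
# without ever seeing the underlying values.
x_sh, y_sh = share(6), share(7)
z_sh = beaver_mul(x_sh, y_sh, (t0, t1))
assert reconstruct(*z_sh) == 42
```

In this sketch, the only cost of the online phase is the opening of the two masked values, while triplet generation is confined to the offline phase; delegating that offline step to the cloud is what lets the edge servers avoid extra interaction or a trusted third party, as the abstract states.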
