Abstract

Cloud-edge collaborative learning, an emerging distributed machine learning (ML) architecture in which a cloud center and edge nodes jointly train models, has received considerable attention recently. However, existing cloud-edge collaborative learning schemes cannot efficiently train high-performance models on large-scale sparse samples and risk leaking the privacy of sensitive data. In this paper, using the homomorphic encryption (HE) cryptographic technique, we present a privacy-preserving cloud-edge collaborative learning scheme over vertically partitioned data, which allows the cloud center and an edge node to securely train a shared model without a third-party coordinator, thereby greatly reducing system complexity. Furthermore, the proposed scheme adopts the batching technique and single instruction multiple data (SIMD) operations to achieve parallel processing. Finally, the evaluation results show that the proposed scheme improves model performance and reduces training time compared with existing methods, and the security analysis indicates that our scheme guarantees security in the semi-honest model.
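
As a rough illustration of the batching/SIMD idea mentioned above, the sketch below packs a mini-batch of values into a single CKKS ciphertext so that one homomorphic operation acts on all slots at once. The abstract does not name a library or parameters; TenSEAL, the encryption parameters, and the toy weights here are assumptions chosen only to make the example self-contained, not the authors' implementation.

```python
# Illustrative sketch only: the abstract does not specify a library or parameters.
# TenSEAL (CKKS) is assumed here purely to demonstrate slot packing (batching/SIMD),
# where one ciphertext holds many values and a single homomorphic op touches them all.
import tenseal as ts

# Hypothetical encryption parameters (not from the paper).
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

# Edge node: pack one feature across 4 samples into a single ciphertext.
batch = [0.5, 1.3, 2.7, 0.9]                      # toy values
enc_batch = ts.ckks_vector(context, batch)

# Cloud center: apply the same plaintext weight to every slot in parallel,
# then add an encrypted partial result from the other party -- each is a
# single SIMD-style operation over all packed samples.
weight = 0.25
enc_scaled = enc_batch * weight
enc_other = ts.ckks_vector(context, [0.1, 0.1, 0.1, 0.1])
enc_partial = enc_scaled + enc_other

# Decryption (by the key holder) recovers per-sample partial results.
print(enc_partial.decrypt())                      # approx [0.225, 0.425, 0.775, 0.325]
```

In a real vertically partitioned setting, the packed values would be per-sample partial terms exchanged between the edge node and the cloud center; that protocol detail is beyond what the abstract states.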
