Abstract

Integrating Deep Learning (DL) techniques in Convolutional Neural Networks (CNNs) with encrypted data analysis is an emerging field for enhancing data privacy and security. A significant challenge in this domain is the incompatibility of standard non-linear Activation Functions (AFs) such as the Rectified Linear Unit (ReLU) and the Hyperbolic Tangent (tanh) with Zero-Knowledge (ZK) encrypted data, which impacts computational efficiency and data privacy. To address this, our paper introduces the novel application of Chebyshev Polynomial Approximation (CPA) to adapt these AFs so that they can process encrypted data effectively. Using the MNIST dataset, we conducted experiments with LeNet and several configurations of AlexNet, extending the input range of the ReLU and tanh functions to optimize the CPA. Our results reveal an optimal polynomial degree (α): α = 10 for ReLU and between α = 10 and α = 15 for tanh, beyond which the gains in accuracy plateau. This finding is crucial for ensuring the accuracy and efficiency of CNNs that process encrypted data. The study demonstrates that although accuracy decreases slightly for plaintext data and more noticeably for ciphertext data, the overall effectiveness of CPA in CNNs is maintained. This advancement enables CNNs to process encrypted data while preserving privacy and marks a significant step in the development of privacy-preserving Machine Learning (ML) and encrypted data analysis.
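
To illustrate the core idea, the sketch below builds degree-10 Chebyshev surrogates for ReLU and tanh with NumPy. The approximation interval [-5, 5] and the use of NumPy's Chebyshev interpolation are assumptions for illustration only; they do not reproduce the paper's exact range or pipeline, but they show why the resulting polynomials (additions and multiplications only) are compatible with encrypted or arithmetic-circuit evaluation.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Assumed approximation interval; the paper's exact extended range is not given here.
DOMAIN = [-5.0, 5.0]

# Degree-10 Chebyshev surrogate for ReLU (alpha = 10, the optimum reported for ReLU).
relu = lambda x: np.maximum(x, 0.0)
relu_cheb = Chebyshev.interpolate(relu, deg=10, domain=DOMAIN)

# Degree-10 surrogate for tanh (the abstract reports alpha between 10 and 15 for tanh).
tanh_cheb = Chebyshev.interpolate(np.tanh, deg=10, domain=DOMAIN)

# Compare the exact activations with their polynomial surrogates on a few points.
x = np.linspace(DOMAIN[0], DOMAIN[1], 5)
print(np.round(relu(x), 3), np.round(relu_cheb(x), 3))
print(np.round(np.tanh(x), 3), np.round(tanh_cheb(x), 3))
```

Because the surrogates are ordinary polynomials, each forward pass needs only additions and multiplications, which is what makes them usable where non-linear functions such as ReLU and tanh cannot be evaluated directly on encrypted inputs.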
