Abstract

Deep learning models such as Convolutional Neural Networks (CNNs) have shown great potential in various applications. However, these techniques face regulatory compliance challenges related to the privacy of user data, especially when they are deployed as a service on a cloud platform. Such concerns can be mitigated by using privacy-preserving machine learning techniques. The purpose of our work is to explore one class of these techniques, Fully Homomorphic Encryption (FHE), for enabling CNN inference on an encrypted real-world dataset. Fully homomorphic encryption is limited in the computational depth it can support, and its operations are resource intensive. We run our experiments on the MNIST dataset to understand these challenges and identify optimization techniques. We then use these insights to achieve the end goal of enabling encrypted inference for binary classification on a melanoma dataset using the Cheon-Kim-Kim-Song (CKKS) encryption scheme available in the open-source HElib library.
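
To make the setting concrete, the sketch below shows a minimal round trip through HElib's CKKS interface: packing real values into plaintext slots, encrypting them, performing a homomorphic multiplication (the kind of operation that consumes the limited computational depth mentioned above), and decrypting the approximate result. The parameter values (m, bits, precision, c) are illustrative assumptions only and are not the configuration used in our experiments.

```cpp
#include <helib/helib.h>
#include <iostream>
#include <vector>

int main() {
  // Build a CKKS context. These parameters are illustrative; in practice they
  // must be tuned to the multiplicative depth of the network being evaluated.
  helib::Context context = helib::ContextBuilder<helib::CKKS>()
                               .m(16 * 1024)   // ring dimension
                               .bits(119)      // ciphertext modulus size
                               .precision(20)  // bits of precision
                               .c(2)           // columns in key switching
                               .build();

  // Generate the secret key; in HElib the secret key can also act as the
  // public key for encryption.
  helib::SecKey secretKey(context);
  secretKey.GenSecKey();
  const helib::PubKey& publicKey = secretKey;

  // Pack two vectors of real numbers into CKKS plaintext slots.
  long n = context.getNSlots();
  std::vector<double> v1(n), v2(n);
  for (long i = 0; i < n; ++i) {
    v1[i] = static_cast<double>(i) / n;
    v2[i] = 1.0 - static_cast<double>(i) / n;
  }
  helib::PtxtArray p1(context, v1), p2(context, v2);

  helib::Ctxt c1(publicKey), c2(publicKey);
  p1.encrypt(c1);
  p2.encrypt(c2);

  // One homomorphic multiplication: each such level of multiplication uses up
  // part of the available computational depth.
  c1 *= c2;

  // Decrypt and read back the (approximate) element-wise products.
  helib::PtxtArray result(context);
  result.decrypt(c1, secretKey);
  std::vector<double> decrypted;
  result.store(decrypted);
  std::cout << "first slot (approx. product): " << decrypted[0] << std::endl;
  return 0;
}
```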
