Abstract

Although deep neural networks (DNNs) are increasingly deployed in industrial cyber-physical systems (ICPSs), the tight interaction between cyber and physical elements leaves them vulnerable to security attacks. In this article, we aim to protect the core intellectual property of DNNs, i.e., the model weights, against such attacks. Unlike conventional approaches, we propose a layerwise protection framework that preserves the confidentiality of DNN model weights during inference, maximizing security quality while satisfying the latency constraint of the DNN task. Exploiting the layerwise execution characteristics of DNN tasks, the encrypted weights of each layer are decrypted on demand and fed to that layer in plaintext. CPU-field-programmable gate array (FPGA) co-scheduling is adopted to accelerate this confidentiality protection: the CPU decrypts the weights while the FPGA executes the DNN layers. To provide optimal confidentiality protection for each layer, the problem is formulated as a quality-of-security maximization problem subject to the layerwise execution constraint and the deadline constraint of the DNN application. Since this problem is NP-hard, a fast approximation algorithm is proposed to obtain a near-optimal solution under the given real-time and security constraints. Extensive experiments and a real-life ICPS application demonstrate the efficiency of the proposed techniques.
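The layerwise decrypt-then-execute flow described above can be sketched as follows. This is a minimal illustration only: the stream cipher, the toy layer computation, and all function names are hypothetical stand-ins, not the paper's actual cipher, scheduler, or FPGA interface.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from the key (stand-in for a real cipher)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR stream 'decryption' -- a placeholder, symmetric with encryption."""
    ks = keystream(key, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

def run_layer(inputs: int, weights: bytes) -> int:
    """Placeholder for layer execution (the FPGA side in the framework)."""
    return inputs + sum(weights) % 7  # toy computation, not a real DNN layer

def layerwise_inference(x: int, encrypted_layers: list[bytes], key: bytes) -> int:
    # In the framework, decryption (CPU) of layer i can overlap with
    # execution (FPGA) of layer i-1; here only the sequential logic is shown.
    for ct in encrypted_layers:
        w = decrypt(ct, key)   # CPU: decrypt this layer's weights
        x = run_layer(x, w)    # FPGA: execute the layer with plaintext weights
        del w                  # plaintext weights discarded after the layer runs
    return x
```

Note the key property the framework relies on: only one layer's weights exist in plaintext at any time, so the full model is never exposed at once.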
