Abstract
In Cyber-Physical Systems (CPS), distributed learning is essential for efficiently handling complex tasks when sufficient resources are available. However, when resources are limited, traditional distributed learning struggles to complete even simple tasks and carries a risk of privacy leakage. As a promising distributed learning paradigm, federated learning requires clients to send only their trained models to the server rather than their private data, thereby preserving client privacy to some extent. However, with the rapid development of artificial intelligence, attacks such as inference attacks can still leak the private information of clients participating in federated learning. Moreover, owing to its distributed nature, federated learning cannot escape the dilemma that model accuracy is constrained by available resources. To address these problems, this paper proposes a federated local differential privacy scheme using Model Parameter Selection, named Fed-MPS, for resource-constrained CPS. Specifically, to cope with limited CPS resources, Fed-MPS adopts a parameter selection algorithm based on update-direction consistency that extracts the parameters contributing to model accuracy for subsequent training, thereby improving the final model accuracy and reducing communication overhead. Furthermore, Fed-MPS applies a local differential privacy mechanism to further enhance client privacy. By adding noise only to the selected parameters, the privacy budget is significantly reduced while model accuracy is preserved. Through privacy analysis, we prove that the proposed Fed-MPS scheme satisfies (ϵ, δ)-DP. Additionally, convergence analysis guarantees that Fed-MPS converges to the global optimum at a rate of O(1/T²) within T rounds of federated learning. Extensive experiments on the benchmark datasets CIFAR-10, MNIST, and Fashion-MNIST demonstrate that, compared with baseline schemes, Fed-MPS achieves higher model accuracy for CPS under resource constraints.
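To make the core idea concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm) of the two steps the abstract describes: selecting parameters whose update direction is consistent across successive local rounds, and adding Gaussian noise only to those selected coordinates. The function name select_and_perturb, the sign-agreement criterion, and the clip_norm and sigma parameters are assumptions introduced here for illustration.

```python
import numpy as np

def select_and_perturb(prev_update, curr_update, sigma, clip_norm=1.0):
    """Illustrative sketch: update-direction-consistent parameter selection
    followed by Gaussian perturbation of only the selected coordinates.

    prev_update, curr_update: flattened model updates from two successive
    local training rounds (numpy arrays of equal shape).
    sigma: Gaussian noise multiplier derived from the (eps, delta) budget.
    clip_norm: clipping bound applied before adding noise.
    """
    # Keep only coordinates whose update direction (sign) agrees across rounds;
    # this sign-agreement rule is a hypothetical stand-in for the paper's
    # update-direction-consistency criterion.
    mask = np.sign(prev_update) == np.sign(curr_update)

    # Clip the selected sub-vector to bound its sensitivity.
    selected = curr_update * mask
    norm = np.linalg.norm(selected)
    if norm > clip_norm:
        selected = selected * (clip_norm / norm)

    # Gaussian noise is added only to the selected coordinates, so unselected
    # parameters consume no privacy budget and need not be transmitted.
    noise = np.random.normal(0.0, sigma * clip_norm, size=selected.shape) * mask
    return selected + noise, mask

# Example: a client perturbs its selected update before sending it to the server.
prev = np.array([0.20, -0.10, 0.05, -0.30])
curr = np.array([0.25, 0.08, 0.04, -0.28])
noisy_update, mask = select_and_perturb(prev, curr, sigma=0.5)
```

In this sketch, only the masked coordinates are noised and communicated, which mirrors the abstract's claim that restricting perturbation to the selected parameters reduces both the privacy budget and the communication overhead.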