Abstract

Federated Learning (FL) offers a promising privacy-preserving learning paradigm for the Space-Air-Ground Integrated Internet of Things (SAGI-IoT) by breaking down data islands and resolving the tension between data privacy and data sharing. Adaptivity, communication efficiency, and model security are the three main challenges currently faced by FL, yet existing works rarely address them simultaneously. Concretely, most existing FL works assume that local models share the same architecture as the global model, which limits adaptivity and cannot meet the heterogeneous requirements of SAGI-IoT. Exchanging large numbers of model parameters not only incurs massive communication overhead but also risks privacy leakage. Moreover, FL based on homomorphic encryption with a single private key provides only weak security. To address these issues, this paper proposes a tensor-empowered, communication-efficient, and trustworthy heterogeneous FL framework, in which participants select heterogeneous local models suited to their actual computing and communication environments, so that clients with different capabilities can each contribute according to their strengths. In addition, tensor train decomposition is leveraged to reduce the number of communicated parameters while maintaining model performance, further lowering the storage requirements and communication overhead of heterogeneous clients. Finally, homomorphic encryption with the double-trapdoor property is utilized to provide a robust and trustworthy environment that defends against inference attacks from malicious external attackers, an honest-but-curious server, and internal participating clients. Extensive experimental results show that, compared with the state of the art, the proposed approach is more adaptive, improves communication efficiency, and better protects model security.
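
The compression step described above, tensor train decomposition of layer weights before transmission, can be illustrated with a minimal sketch. The following Python/NumPy code is not the paper's implementation; the 256x256 layer, its reshaping into a 4-way tensor, and the maximum TT rank of 8 are illustrative assumptions. It applies the standard TT-SVD procedure and reports how many parameters a client would transmit instead of the full weight matrix.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into TT cores via successive truncated SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    unfolding = tensor.reshape(r_prev * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r = min(max_rank, len(S))
        # k-th TT core has shape (r_{k-1}, n_k, r_k).
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        # Carry the remainder forward and refold for the next mode.
        unfolding = (np.diag(S[:r]) @ Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(unfolding.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_tensor(cores):
    """Contract TT cores back into the full tensor (to check reconstruction)."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.squeeze(axis=(0, -1))

# Example: a 256x256 dense layer reshaped into a 4-way tensor (16, 16, 16, 16).
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))
cores = tt_svd(W.reshape(16, 16, 16, 16), max_rank=8)

tt_params = sum(c.size for c in cores)
print(f"dense params: {W.size}, TT params: {tt_params}")
W_hat = tt_to_tensor(cores).reshape(256, 256)
print("relative reconstruction error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```

In this toy setting each client would send the small TT cores rather than the full matrix; the reconstruction error reflects how well the chosen TT ranks capture the tensor, and a random matrix is a worst case compared with trained weights, which typically admit much better low-rank structure.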
