Abstract

Vertical Federated Learning (VFL) is a privacy-preserving distributed machine learning paradigm in which participants collaboratively train a model over local data that overlap largely in the sample space but differ in the feature space. Existing VFL methods are mainly based on synchronous computation and homomorphic encryption (HE). Because participants differ in their communication and computation resources, straggling participants can delay synchronous VFL model training, resulting in low computational efficiency. In addition, HE incurs high computation and communication costs. Moreover, it is difficult to establish a VFL coordinator (a.k.a. server) that all participants can trust. To address these problems, we propose an efficient Asynchronous Multi-participant Vertical Federated Learning method (AMVFL). AMVFL leverages asynchronous training to reduce waiting time. At the same time, secret sharing is used instead of HE for privacy protection, which further reduces computational cost. In addition, AMVFL does not require a trusted entity to serve as the VFL coordinator. Experimental results on real-world and synthetic datasets demonstrate that AMVFL significantly reduces computational cost and improves model accuracy compared to five state-of-the-art VFL methods.
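To illustrate why secret sharing is a lightweight alternative to HE for protecting intermediate values, below is a minimal sketch of additive secret sharing. The field modulus, share layout, and use of NumPy are illustrative assumptions for this sketch, not details of AMVFL's actual protocol.

```python
# Minimal sketch of additive secret sharing (assumed parameters, not AMVFL's spec).
# Each secret is split into random shares that sum to the secret mod PRIME, so
# no single party learns anything, yet sums can be computed share-wise.
import numpy as np

PRIME = 2**61 - 1  # assumed field modulus for this sketch
rng = np.random.default_rng(0)

def make_shares(secret: np.ndarray, n_parties: int) -> list[np.ndarray]:
    """Split an integer-encoded tensor into n_parties additive shares mod PRIME."""
    shares = [rng.integers(0, PRIME, size=secret.shape, dtype=np.int64)
              for _ in range(n_parties - 1)]
    last = (secret - sum(shares)) % PRIME  # chosen so all shares sum to the secret
    return shares + [last]

def reconstruct(shares: list[np.ndarray]) -> np.ndarray:
    """Recover the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME

# Parties can add their shares of two secrets locally; the share-wise sums
# reconstruct to the sum of the secrets, using only cheap modular arithmetic
# rather than expensive HE operations.
a = np.array([3, 14, 15], dtype=np.int64)
b = np.array([9, 26, 5], dtype=np.int64)
sa, sb = make_shares(a, 3), make_shares(b, 3)
summed = [(x + y) % PRIME for x, y in zip(sa, sb)]
assert np.array_equal(reconstruct(summed), (a + b) % PRIME)
```

Because aggregation happens on shares, partial results can be combined without any party (or a trusted coordinator) ever seeing another party's raw values, which is consistent with the coordinator-free design described above.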
