Abstract

Vertical federated learning (VFL) enables collaborative machine learning on vertically partitioned data while preserving privacy, and has attracted widespread attention from academia and industry. However, most existing VFL methods face two daunting challenges in real-world applications. First, most VFL methods assume that at least one party holds the complete set of labels for all data samples. This assumption is often violated in practice, where each party holds only partial labels. Second, limited computational and communication resources at participating parties can cause the straggler problem and slow down training convergence. To address these challenges, we propose a novel VFL algorithm named Cascade Vertical Federated Learning (CVFL), in which partitioned labels can be fully utilized to train neural networks. To mitigate the straggler problem, we design a novel optimization objective that increases the stragglers' contribution to the trained models. Comprehensive experiments demonstrate the effectiveness and efficiency of CVFL.
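To make the setting concrete, the sketch below illustrates what "vertically partitioned data with partitioned labels" means: two parties hold disjoint feature columns of the same samples, and each party knows the labels for only a subset of samples. All names and shapes here are illustrative assumptions, not the paper's actual method; the random linear maps merely stand in for each party's bottom neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not from the paper): 8 shared samples whose
# features are vertically partitioned between two parties.
n = 8
X_a = rng.normal(size=(n, 3))   # party A holds feature columns 0-2
X_b = rng.normal(size=(n, 2))   # party B holds feature columns 3-4
y = rng.integers(0, 2, size=n)  # ground-truth binary labels

# Partitioned labels: each party knows labels for only some samples;
# together the parties cover all samples, but no single party has them all.
labels_a = {i: int(y[i]) for i in range(0, n, 2)}  # A labels even samples
labels_b = {i: int(y[i]) for i in range(1, n, 2)}  # B labels odd samples

# Each party computes local embeddings with its own bottom model
# (a random linear map standing in for a neural network here).
W_a = rng.normal(size=(3, 4))
W_b = rng.normal(size=(2, 4))
H_a = X_a @ W_a
H_b = X_b @ W_b

# A top model would consume the concatenated embeddings; a given sample's
# supervised loss can only be formed by the party that holds its label.
H = np.concatenate([H_a, H_b], axis=1)
assert set(labels_a) | set(labels_b) == set(range(n))  # labels cover all samples
assert not (set(labels_a) & set(labels_b))             # label sets are disjoint
```

The assertions capture the key departure from standard VFL: the label set is split across parties rather than held entirely by one of them.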
