Abstract

Multi-view learning aims to discover a global common structure shared by different views collected from multiple individual sources. The nascent field of federated learning seeks to train a global model over distributed networks of devices. This paper shows that multi-view learning is naturally suited to addressing the feature heterogeneity of the federated setting. We propose a novel model, robust federated multi-view learning (FedMVL), formulated as follows: given a dataset with M views, machine learning models must be trained while the M views remain distributed across M devices or nodes. To handle challenges unique to the federated setting, such as stragglers and fault tolerance, we derive an iterative federated optimization algorithm that gives each node the flexibility to solve its subproblem only approximately. To the best of our knowledge, our model is the first to address high communication cost, fault tolerance, and stragglers in distributed multi-view learning. The proposed model also achieves encouraging performance on clustering tasks compared to closely related methods, as we illustrate through simulations on several real-world datasets.
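The setting sketched in the abstract — M views distributed across M nodes, each node approximately solving a local subproblem while a shared common structure is aggregated across rounds — can be illustrated with a minimal simulation. Everything concrete below is an illustrative assumption, not the paper's actual algorithm: the objective (multi-view matrix factorization), the gradient-step local solver, the node-dropout probability, and all variable names are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): M views of the same n samples,
# each view held on a separate node; k is the shared latent dimension.
M, n, k = 3, 50, 4
dims = [8, 6, 10]                              # per-view feature dimensions
views = [rng.normal(size=(n, d)) for d in dims]

# Global common structure H, plus a view-specific basis W_m per node,
# as in multi-view matrix factorization (an assumed concrete objective).
H = rng.normal(size=(n, k))
Ws = [rng.normal(size=(k, d)) for d in dims]

def total_loss(H, Ws):
    return sum(np.linalg.norm(views[m] - H @ Ws[m]) ** 2 for m in range(M))

def local_update(X, H, W, steps=5, lr=1e-3):
    """Approximately solve the node's subproblem min ||X - H W||_F^2 with a
    few gradient steps; stopping early models inexact, straggler-tolerant
    local work."""
    for _ in range(steps):
        R = H @ W - X                          # local residual
        W = W - lr * (H.T @ R)                 # update the view-specific basis
        H = H - lr * (R @ W.T)                 # propose a shared-factor update
    return H, W

loss0 = total_loss(H, Ws)
for _ in range(30):                            # communication rounds
    proposals = []
    # Some nodes may drop out of a round (simulated fault tolerance).
    active = [m for m in range(M) if rng.random() > 0.2] or [0]
    for m in active:
        Hm, Ws[m] = local_update(views[m], H.copy(), Ws[m])
        proposals.append(Hm)
    H = np.mean(proposals, axis=0)             # server averages the proposals

loss = total_loss(H, Ws)
```

Averaging the shared-factor proposals is one simple FedAvg-style aggregation choice; the paper's optimization scheme may aggregate differently, but the loop structure (local approximate solves, periodic communication of only the shared variable) matches the federated constraints the abstract describes.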
