Multiparty learning (MPL) is an emerging framework for privacy-preserving collaborative learning. It enables individual devices to build a knowledge-shared model while keeping sensitive data local. However, as the number of users continues to grow, the heterogeneity gap between data and equipment widens, which leads to the problem of model heterogeneity. In this article, we concentrate on two practical issues, the data heterogeneity problem and the model heterogeneity problem, and propose a novel personalized MPL method named device-performance-driven heterogeneous MPL (HMPL). First, to address the data heterogeneity problem, we focus on the setting in which devices hold data of arbitrary sizes, and introduce a heterogeneous feature-map integration method to adaptively unify the various feature maps. Meanwhile, to handle the model heterogeneity problem, since it is essential to customize models to the devices' varying computing performance, we propose a layer-wise model generation and aggregation strategy. This strategy generates customized models based on each device's performance. In the aggregation process, the shared model parameters are updated under the rule that network layers with the same semantics are aggregated with each other. Extensive experiments are conducted on four popular datasets, and the results demonstrate that our proposed framework outperforms the state of the art (SOTA).
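To make the layer-wise aggregation rule concrete, the following is a minimal sketch, not the authors' implementation, of the stated idea that only network layers with the same semantics are aggregated across heterogeneous client models. The layer names (`conv1`, `conv2`, `fc`) and the plain averaging rule are illustrative assumptions.

```python
# Hypothetical sketch of semantics-aligned, layer-wise aggregation:
# clients hold customized models of different depths, and each layer
# is averaged only over the clients that actually contain it.
from collections import defaultdict
import numpy as np

def aggregate_by_layer(client_models):
    """client_models: list of dicts mapping layer name -> weight array."""
    buckets = defaultdict(list)
    for model in client_models:
        for name, weights in model.items():
            buckets[name].append(weights)
    # Average each semantically matched layer across its contributing clients.
    return {name: np.mean(ws, axis=0) for name, ws in buckets.items()}

# Example: two customized models of different depths share "conv1" and "fc";
# "conv2" exists only in the larger model and is kept as-is.
small = {"conv1": np.ones((3, 3)), "fc": np.zeros((4,))}
large = {"conv1": np.zeros((3, 3)), "conv2": np.ones((3, 3)), "fc": np.ones((4,))}
shared = aggregate_by_layer([small, large])
```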