Abstract

Data islands effectively block the practical application of machine learning. To meet this challenge, a new framework known as federated learning was created, which allows model training on large amounts of scattered data owned by different data providers. This article presents a parallel solution for computing logistic regression based on a distributed asynchronous task framework. Compared with existing work, our proposed solution does not rely on any third-party coordinator, and hence offers better security and can solve the multi-training problem. Logistic regression based on homomorphic encryption is implemented in Python and used for vertical federated learning and for prediction with the resulting model. We evaluate the proposed solution on the MNIST dataset, and the experimental results show that it achieves good performance.
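The coordinator-free design rests on additively homomorphic encryption: parties can multiply ciphertexts of their partial gradients so that decryption yields the plaintext sum, without any intermediary seeing individual contributions. The paper's exact protocol is not reproduced here; the following is a minimal sketch of the underlying homomorphic property using a toy Paillier cryptosystem with tiny, insecure demo primes (all parameter values are illustrative assumptions, not the paper's).

```python
from math import gcd

# Toy Paillier cryptosystem (tiny primes, NOT cryptographically secure).
# Illustrates the additive homomorphism that lets vertical-FL parties
# aggregate logistic-regression gradients without a third-party
# coordinator ever seeing the plaintexts.
p, q = 293, 433                 # demo primes only
n = p * q
n2 = n * n
g = n + 1                       # standard simple choice of generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(x):
    # The "L function" of Paillier decryption: L(x) = (x - 1) / n
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption constant

def encrypt(m, r=17):
    # c = g^m * r^n mod n^2; r is the randomizer (fixed here for determinism)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Each party encrypts its partial gradient component; the product of
# ciphertexts decrypts to the sum of plaintexts: E(a) * E(b) = E(a + b).
grad_a, grad_b = 15, 27
aggregated = (encrypt(grad_a) * encrypt(grad_b, r=23)) % n2
print(decrypt(aggregated))   # 42
```

In the actual protocol, gradients are fixed-point encoded real numbers rather than small integers, and the randomizer is drawn fresh per encryption; the homomorphic aggregation step is what removes the need for a trusted coordinator.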
