Abstract

In Federated Learning (FL), a group of clients collaboratively trains a model under the coordination of a central server, while the training data remains decentralized. Each client uses its private data to train a local model and then uploads the updated model parameters to the server, which aggregates them to build a global model. Once the global model reaches the expected threshold/accuracy, the server sends it back to the clients for their own use. FL thus aims to preserve the privacy of training data: data generated by local clients is processed locally, and only periodic local model updates are provided to the server. This paper provides a comprehensive overview of FL with an emphasis on combining FL with other technologies and techniques. Since FL is vulnerable to several attacks, researchers have combined the FL concept with approaches such as homomorphic encryption and blockchain. We divide the existing solutions into four categories: FL alone, the fusion of blockchain and FL, the fusion of FL and homomorphic encryption, and the fusion of FL, homomorphic encryption, and blockchain. For all compared approaches, we explain their advantages and disadvantages and, based on these, highlight open problems.
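The communication round described above (local training on private data, upload of model updates, server-side aggregation) can be sketched with federated averaging (FedAvg), a common aggregation rule. Everything concrete here is an illustrative assumption, not this paper's setup: the model is a plain weight vector, each client runs a few gradient steps on a least-squares loss, and the server averages updates weighted by local dataset size.

```python
import numpy as np

def local_update(weights, data, targets, lr=0.1, epochs=5):
    """Client step: train a local linear model on the client's private data.
    (Linear model and squared loss are illustrative assumptions.)"""
    w = weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - targets) / len(targets)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """Server step: collect client updates and aggregate them,
    weighted by each client's local dataset size (FedAvg)."""
    updates, sizes = [], []
    for data, targets in clients:
        updates.append(local_update(global_weights, data, targets))
        sizes.append(len(targets))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Synthetic example: three clients with different amounts of private data,
# all generated from the same underlying model (a hypothetical setup).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(50):  # communication rounds between server and clients
    w = federated_round(w, clients)
```

Note that the raw client data never leaves `local_update`; only the updated weight vectors are shared with the server, which is the privacy property the abstract emphasizes.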
