Abstract

With their ability to analyze data, artificial intelligence technologies have made difficult tasks easier, and their tools are now used in almost every aspect of life. For example, Machine Learning (ML), an offshoot of artificial intelligence, has become a focus of interest for researchers in industry, education, healthcare, and other fields, and has proven to be as efficient as, and in some cases better than, experts at solving various problems. However, the obstacles to ML's progress are still being explored, and Federated Learning (FL) has been presented as a solution to the problems of privacy and confidentiality. In the FL approach, users do not disclose their data throughout the learning process, which improves privacy and security. In this article, we look at the security and privacy concepts of FL and the threats and attacks it faces. We also address the security measures used in FL aggregation procedures. In addition, we examine the use of homomorphic encryption to protect FL data exchange, as well as other security strategies. Finally, we discuss open security and privacy issues in FL and the improvements that could be made in this context to increase the efficiency of FL algorithms.
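To illustrate the FL idea summarized above, the following is a minimal sketch of a FedAvg-style round in which clients train locally on their private data and the server only aggregates model parameters. The linear model, synthetic client data, and the helper functions `local_update` and `fed_avg` are assumptions for illustration, not the specific protocol or defenses studied in the article.

```python
# Minimal FedAvg-style sketch: raw client data never leaves the clients;
# only model weights are sent to the server for aggregation.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient steps on its own data only."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server aggregation: size-weighted average of the client models."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Synthetic, per-client datasets standing in for private local data.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(20):                                  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = fed_avg(updates, [len(y) for _, y in clients])

print("recovered weights:", global_w)                # approaches true_w
```

In a privacy-hardened variant of this loop, the client updates would additionally be protected before aggregation, for example with secure aggregation or homomorphic encryption as discussed in the article.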
