Abstract

As an effective optimization technique that automatically tunes the architecture and hyperparameters of deep neural network models, neural architecture search (NAS) has made significant progress in deep learning model design and automated machine learning (AutoML). However, with growing attention to privacy issues, privacy-preserving machine learning approaches have also attracted much interest. Federated learning (FL) is a machine learning paradigm that addresses data privacy, primarily in heterogeneous and distributed scenarios. Combining FL with NAS can therefore effectively address the privacy issues faced by NAS. Several studies have proposed federated neural architecture search methods, which offer a feasible way for multiple parties to jointly construct deep learning models with optimal performance without sharing data. Federated neural architecture search focuses on the design challenges of deep neural network models for distributed data and on making the models more suitable for heterogeneous scenarios. In this paper, we summarize research related to neural architecture search and FL, review current work in federated neural architecture search, and discuss open issues in existing research. The objective is to offer an overview of combining FL with NAS, balancing the privacy protection of deep neural network models with efficient design. Privacy protection is the primary goal of federated neural architecture search while ensuring model performance.

Keywords: Federated learning; Neural architecture search; Federated NAS; Security and privacy
