Abstract

Artificial intelligence (AI) has developed rapidly over the last decade and can now simulate human thinking in a variety of situations. When deep neural networks (DNNs) are trained with huge datasets and high computational resources, they can produce excellent results, but their learning process is complicated and time-consuming. Under data scarcity, such algorithms cannot learn tasks quickly or perform at a level close to human intelligence. Recent advances in deep meta-learning address this problem. Meta-learning has a wide range of applications in which the meta-data (data about data) of previously seen tasks, datasets, or trained models can be employed to optimise learning. To give an insight into existing meta-learning approaches for DNN model optimisation, the authors present a survey of different meta-learning techniques as well as current optimisation-based approaches, their merits, and open challenges. For the experiments, the Reptile meta-learning algorithm was chosen because it uses only first-order derivatives during optimisation, making it feasible to solve the underlying optimisation problems. The authors achieved a 5% increase in accuracy with the proposed version of the Reptile meta-learning algorithm.
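For context on the first-order nature of Reptile mentioned above, the sketch below shows the standard Reptile outer-loop update (train a copy of the model on a sampled task with plain SGD, then move the meta-parameters toward the adapted parameters). This is a minimal, generic PyTorch sketch of the published Reptile rule, not the authors' proposed variant; the learning rates, step counts, and helper names are illustrative assumptions.

```python
import copy
import torch

def reptile_outer_step(model, task_loader, loss_fn,
                       inner_lr=0.01, inner_steps=5, meta_lr=0.1):
    """One Reptile meta-update on a single sampled task.
    Hyperparameters here are placeholders, not the paper's settings."""
    # Adapt a copy of the current meta-parameters on the sampled task
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _, (x, y) in zip(range(inner_steps), task_loader):
        opt.zero_grad()
        loss_fn(adapted(x), y).backward()  # only first-order gradients are needed
        opt.step()

    # Reptile update: theta <- theta + meta_lr * (phi - theta)
    with torch.no_grad():
        for p, p_adapted in zip(model.parameters(), adapted.parameters()):
            p.add_(meta_lr * (p_adapted - p))
```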

