Abstract

This paper presents a comprehensive study of nature-inspired hyperparameter optimization, with a particular focus on the Honey Badger Algorithm (HBA) and the Aquila Optimizer. The study offers an in-depth analysis of both algorithms, examining their strengths and weaknesses and comparing their observed behaviour against their theoretical advantages. Through implementations on several datasets, the paper demonstrates the promise of these algorithms for optimizing hyperparameters such as the learning rate and the number of hidden layers. The findings show that HBA and the Aquila Optimizer offer potential alternatives to existing approaches, providing more effective and efficient solutions for hyperparameter optimization. The paper thereby contributes to the ongoing discourse on the role of nature-inspired algorithms in solving unconventional problems.
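To make the idea concrete, the following is a minimal sketch of a population-based search over a learning rate and a hidden-layer count, loosely inspired by HBA's exploration/exploitation balance. It is an illustrative assumption, not the paper's implementation: the objective is a toy surrogate for validation loss, and HBA's full update rules (smell intensity, density factor) are reduced to a single decaying perturbation toward the current best.

```python
import math
import random

# Toy surrogate for validation loss: lowest near lr = 0.01 and 3 hidden layers.
# This objective is an illustrative assumption, not taken from the paper.
def validation_loss(lr, n_layers):
    return (math.log10(lr) + 2) ** 2 + 0.5 * (n_layers - 3) ** 2

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def hba_style_search(pop_size=10, iters=50, seed=0):
    """Highly simplified HBA-style population search: candidates drift
    toward the best-so-far with a perturbation that shrinks over time,
    mimicking the algorithm's decaying density factor."""
    rng = random.Random(seed)
    # Each candidate is (learning_rate, number_of_hidden_layers).
    pop = [(10 ** rng.uniform(-4, 0), rng.randint(1, 8)) for _ in range(pop_size)]
    best = min(pop, key=lambda c: validation_loss(*c))
    for t in range(iters):
        alpha = 1 - t / iters  # decaying step size
        new_pop = []
        for lr, nl in pop:
            # Stochastic move biased toward the current best candidate.
            lr_new = clip(lr + alpha * rng.uniform(-1, 1) * (best[0] - lr) * 2
                          + alpha * rng.gauss(0, 0.05), 1e-4, 1.0)
            nl_new = int(clip(nl + round(alpha * rng.uniform(-1, 1)
                                         * (best[1] - nl) * 2), 1, 8))
            cand = (lr_new, nl_new)
            new_pop.append(cand)
            if validation_loss(*cand) < validation_loss(*best):
                best = cand
        pop = new_pop
    return best

best = hba_style_search()
print("best (lr, hidden layers):", best)
```

A real run would replace `validation_loss` with training and evaluating a network at the candidate hyperparameters; the population loop itself is unchanged.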
