Abstract

Designing optimisation algorithms that perform well in general requires experimentation on a range of diverse problems. Training neural networks is an optimisation task that has gained prominence with the recent successes of deep learning. Although evolutionary algorithms have been used for training neural networks, gradient descent variants are by far the most common choice, owing to their reliably strong performance on large-scale machine learning tasks. With this paper we contribute CORNN (Continuous Optimisation of Regression tasks using Neural Networks), a large suite for benchmarking the performance of any continuous black-box algorithm on neural network training problems. Using a range of regression problems and neural network architectures, problem instances with different dimensions and levels of difficulty can be created. We demonstrate the use of the CORNN suite by comparing the performance of three evolutionary and swarm-based algorithms on over 300 problem instances, showing evidence of performance complementarity between the algorithms. As a baseline, the performance of the best population-based algorithm is benchmarked against a gradient-based approach. The CORNN suite is shared as a public web repository to facilitate easy integration with existing benchmarking platforms.
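The core idea of such a suite, fixing a regression dataset and a network architecture and exposing the training loss as a continuous black-box function of the flattened weight vector, can be sketched as follows. This is a minimal illustration, not the actual CORNN API; the task, architecture, and function names (`cornn_like_objective`) are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch: a neural-network training problem exposed as a
# continuous black-box objective over the flattened weight vector.
# Architecture: 1 input -> H hidden units (tanh) -> 1 output.
H = 8
DIM = (1 * H + H) + (H * 1 + 1)  # layer-1 weights+biases, layer-2 weights+bias

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(64, 1))  # toy regression inputs
y = np.sin(3.0 * X)                       # regression target

def cornn_like_objective(w):
    """Mean squared error of the fixed network given weight vector w."""
    w = np.asarray(w, dtype=float)
    W1 = w[:H].reshape(1, H)
    b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H].reshape(H, 1)
    b2 = w[3 * H:]
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - y) ** 2))

# Any continuous black-box optimiser can minimise this function;
# here, a simple (1+1)-style random local search as a stand-in:
best_w = rng.normal(size=DIM)
best_f = cornn_like_objective(best_w)
for _ in range(200):
    cand = best_w + 0.1 * rng.normal(size=DIM)
    f = cornn_like_objective(cand)
    if f < best_f:
        best_w, best_f = cand, f
```

Because the optimiser only ever sees `w` in and a scalar loss out, any population-based or gradient-free method can be plugged in unchanged, which is the property the suite exploits for benchmarking.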
