Abstract
The introduction of the ReLU function in neural network architectures yielded substantial improvements over sigmoidal activation functions and made it possible to train deep networks. Ever since, the search for new activation functions has been an active research topic. To the best of our knowledge, however, the design of new activation functions has mostly been done by hand. In this work, we propose a self-adaptive evolutionary algorithm that searches for new activation functions using a genetic programming approach, and we compare the performance of the resulting activation functions to ReLU. We also analyze the shape of the evolved activations to check for common traits such as monotonicity or piecewise linearity, and we study the effects of self-adaptation to determine which genetic operators perform well in the context of a search for new activation functions. We conduct a thorough experimental study on datasets of different sizes and types, using several neural network architectures, and we report the mean and standard deviation of the performance metrics over multiple runs, with favorable results.
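To make the approach concrete, below is a minimal sketch of a genetic programming search over activation functions: candidate activations are expression trees built from a primitive operator set, then mutated and selected by fitness. The operator set, the toy fitness proxy, and all hyperparameters are illustrative assumptions, not the paper's actual configuration; in the paper's setting, fitness would be the validation performance of a network trained with the candidate activation, and the self-adaptation of operators is not modeled here.

```python
# Illustrative sketch only: primitive set, fitness proxy, and hyperparameters
# are assumptions, not the paper's actual method.
import math
import random

# Assumed primitive operators for the expression trees.
UNARY = {
    "tanh": math.tanh,
    "neg": lambda x: -x,
    "relu": lambda x: max(0.0, x),
}
BINARY = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "max": max,
    "min": min,
}

def random_tree(depth=3):
    """Grow a random expression tree over the input variable 'x'."""
    if depth == 0 or random.random() < 0.3:
        return "x"
    if random.random() < 0.5:
        return (random.choice(list(UNARY)), random_tree(depth - 1))
    op = random.choice(list(BINARY))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Evaluate an expression tree at a scalar input x."""
    if tree == "x":
        return x
    op, *children = tree
    if op in UNARY:
        return UNARY[op](evaluate(children[0], x))
    a, b = (evaluate(c, x) for c in children)
    return BINARY[op](a, b)

def mutate(tree, depth=2):
    """Replace a randomly chosen subtree with a freshly grown one."""
    if tree == "x" or random.random() < 0.3:
        return random_tree(depth)
    op, *children = tree
    i = random.randrange(len(children))
    children[i] = mutate(children[i], depth)
    return (op, *children)

def fitness(tree):
    """Toy stand-in: rewards well-behaved, non-constant functions.
    In the real method this would be the trained network's performance."""
    xs = [i / 10.0 for i in range(-30, 31)]
    ys = [evaluate(tree, x) for x in xs]
    if any(not math.isfinite(y) or abs(y) > 1e6 for y in ys):
        return float("-inf")
    mean = sum(ys) / len(ys)
    return sum((y - mean) ** 2 for y in ys) / len(ys)

def evolve(pop_size=20, generations=30):
    """Simple truncation-selection loop: keep the top half, refill by mutation."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

if __name__ == "__main__":
    random.seed(0)  # reproducible demo run
    print("best activation tree:", evolve())
```

The evolved tree can then be compiled into an activation layer and compared against ReLU, which is the kind of comparison the experimental study performs.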