Abstract
Recent studies have shown that the choice of activation function can significantly affect the performance of various learning methods and deep networks. The activation function plays an important role in modeling nonlinear problems, and various nonlinear activation functions for fully connected networks have been studied. In this paper, we propose a combined parametric activation function that can improve the performance of a fully connected neural network. Combined parametric activation functions can be created simply by adding parametric activation functions. A parametric activation function is a function that can be optimized in the direction of minimizing the loss function by applying appropriate parameters that convert the scale and location of the activation function according to the input data. The development of Artificial Neural Networks (ANNs) has achieved many fruitful results so far, and the activation function is one of the principal factors that affect the performance of a network. In this paper, we also discuss the impact of varying model width and depth on robustness and the impact of using learnable parametric activation functions (PAFs).
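As a minimal sketch of the idea described above (not the authors' implementation), a parametric sigmoid with learnable scale and location parameters, and a combined activation formed by summing such units, could look like the following in PyTorch. The class names, parameter names (k, a, b), and initial values are illustrative assumptions; the parameters would be trained jointly with the network weights by minimizing the loss.

```python
import torch
import torch.nn as nn

class ParametricSigmoid(nn.Module):
    """Sigmoid with learnable scale and location parameters (illustrative sketch)."""
    def __init__(self):
        super().__init__()
        # k scales the output, a scales the input, b shifts the input location.
        # Initial values are arbitrary assumptions, not taken from the paper.
        self.k = nn.Parameter(torch.tensor(1.0))
        self.a = nn.Parameter(torch.tensor(1.0))
        self.b = nn.Parameter(torch.tensor(0.0))

    def forward(self, x):
        return self.k * torch.sigmoid(self.a * x + self.b)

class CombinedParametricActivation(nn.Module):
    """Combined parametric activation: a simple sum of parametric activation units."""
    def __init__(self, n_units=2):
        super().__init__()
        self.units = nn.ModuleList([ParametricSigmoid() for _ in range(n_units)])

    def forward(self, x):
        # Summing the parametric units; their parameters are optimized together
        # with the rest of the network during training.
        return sum(unit(x) for unit in self.units)
```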