Abstract
We address the following problem: given a set of complex images or a large database, the numerical and computational complexity and the quality of approximation of a neural network may drastically differ from one activation function to another. A general novel methodology, the scaled polynomial constant unit activation function "SPOCU," is introduced and shown to work satisfactorily on a variety of problems. Moreover, we show that SPOCU can outperform previously introduced activation functions with good properties, e.g., SELU and ReLU, on generic problems. In order to explain the good properties of SPOCU, we provide several theoretical and practical motivations, including a tissue growth model and memristive cellular nonlinear networks. We also provide an estimation strategy for the SPOCU parameters and describe its relation to the generation of a random-type Sierpinski carpet, related to the [pppq] model. One of the attractive properties of SPOCU is its genuine normalization of layer outputs. We illustrate the SPOCU methodology on cancer discrimination, including mammary and prostate cancer and data from the Wisconsin Diagnostic Breast Cancer dataset. Moreover, we compare SPOCU with SELU and ReLU on the large MNIST dataset; its very good performance there further justifies the usefulness of SPOCU.
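For reference, the two baselines named in the abstract, ReLU and SELU, are commonly defined as below (the SELU constants come from the self-normalizing neural network literature). This is an illustrative sketch only; the SPOCU function itself is not reproduced here, since its closed form is not given in this abstract.

```python
import numpy as np

# SELU constants as commonly reported for self-normalizing networks
# (Klambauer et al., 2017).
SELU_LAMBDA = 1.0507009873554805
SELU_ALPHA = 1.6732632423543772


def relu(x):
    """Rectified linear unit: elementwise max(0, x)."""
    return np.maximum(0.0, np.asarray(x, dtype=float))


def selu(x):
    """Scaled exponential linear unit, the self-normalizing baseline."""
    x = np.asarray(x, dtype=float)
    return SELU_LAMBDA * np.where(x > 0, x, SELU_ALPHA * (np.exp(x) - 1.0))
```

Swapping such a function for another in a fixed architecture is precisely the kind of change whose effect on complexity and approximation quality the paper studies.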
Highlights
Neural computing and activations in neural networks are multidisciplinary topics, which range from neuroscience to theoretical statistical physics
We address the following problem: given a set of complex images or a large database, the numerical and computational complexity and the quality of approximation of a neural network may drastically differ from one activation function to another
In order to explain the good properties of the scaled polynomial constant unit (SPOCU), we provide several theoretical and practical motivations, including a tissue growth model and memristive cellular nonlinear networks
Summary
Neural computing and activations in neural networks are multidisciplinary topics, which range from neuroscience to theoretical statistical physics. The main contribution of this paper is the construction and testing of the novel scaled polynomial constant unit (SPOCU) activation function. This activation function relates to complexity patterns through the phenomenon of percolation, and it can outperform previously introduced activation functions, e.g., SELU and ReLU. SPOCU helps fill the gap in the theory by "picking up" the appropriate properties for an activation function directly from training on the classification of complex patterns, e.g., cancer images. Such efforts can contribute to many applications. In Section 4, we use the basic generator of the random Sierpinski carpet as a new activation function, SPOCU, for self-normalizing neural networks. We introduce a simple computation of a matrix expressing the form of fractals, including the random SC.
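The matrix form of a random Sierpinski carpet (SC) mentioned above can be sketched as follows. This is an illustrative construction, not the paper's exact algorithm: at each refinement step every cell is subdivided 3x3 with the centre removed, and each surviving cell is then independently kept with a probability `p` (a hypothetical stand-in for the percolation-type parameter; `p = 1` recovers the classical deterministic carpet).

```python
import numpy as np


def random_sierpinski_carpet(depth, p=1.0, seed=None):
    """Return a 0/1 matrix approximating a (random) Sierpinski carpet.

    depth : number of 3x3 refinement steps; result is 3**depth square.
    p     : probability of keeping each non-centre subcell (p=1 gives
            the classical deterministic carpet).
    """
    rng = np.random.default_rng(seed)
    # 3x3 generator: centre cell is always removed.
    base = np.ones((3, 3), dtype=np.uint8)
    base[1, 1] = 0
    carpet = np.ones((1, 1), dtype=np.uint8)
    for _ in range(depth):
        # Refine: each filled cell becomes a copy of the generator.
        carpet = np.kron(carpet, base)
        # Random variant: thin the surviving cells with probability 1 - p.
        keep = rng.random(carpet.shape) <= p
        carpet = (carpet * keep).astype(np.uint8)
    return carpet
```

With `p = 1` and `depth = 2` this yields the familiar 9x9 carpet with 64 filled cells; lowering `p` produces sparser, percolation-like random variants.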