Abstract

We address the following problem: given a set of complex images or a large database, the numerical and computational complexity and the quality of approximation of a neural network may differ drastically from one activation function to another. A novel general methodology, the scaled polynomial constant unit activation function “SPOCU,” is introduced and shown to perform satisfactorily on a variety of problems. Moreover, we show that SPOCU can outperform activation functions with already established good properties, e.g., SELU and ReLU, on generic problems. To explain the good properties of SPOCU, we provide several theoretical and practical motivations, including a tissue growth model and memristive cellular nonlinear networks. We also provide an estimation strategy for the SPOCU parameters and relate it to the generation of a random type of Sierpinski carpet, connected to the [pppq] model. One of the attractive properties of SPOCU is its genuine normalization of the output of layers. We illustrate the SPOCU methodology on cancer discrimination, including mammary and prostate cancer, and on data from the Wisconsin Diagnostic Breast Cancer dataset. Moreover, we compare SPOCU with SELU and ReLU on the large MNIST dataset, where its very good performance further justifies its usefulness.

Highlights

  • Neural computing and activations in neural networks are multidisciplinary topics, which range from neuroscience to theoretical statistical physics

  • We address the following problem: given a set of complex images or a large database, the numerical and computational complexity and the quality of approximation of a neural network may differ drastically from one activation function to another

  • To explain the good properties of the scaled polynomial constant unit (SPOCU), we provide several theoretical and practical motivations, including a tissue growth model and memristive cellular nonlinear networks

Summary

Introduction

Neural computing and activations in neural networks are multidisciplinary topics, ranging from neuroscience to theoretical statistical physics. The main contribution of this paper is the construction and testing of the novel scaled polynomial constant unit (SPOCU) activation function. This activation function relates to complexity patterns through the phenomenon of percolation, and it can outperform already introduced activation functions, e.g., SELU and ReLU. SPOCU contributes to filling the gap in the theories by ‘‘picking up’’ the appropriate properties for an activation function directly from the training of classifiers on complex patterns, e.g., cancer images. Such efforts can contribute to many applications. In Section 4, we use the basic generator of the random Sierpinski carpet as a new activation function, SPOCU, for self-normalizing neural networks. We introduce a simple computation of the matrix expressing the form of fractals, including the random SC.
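The introduction refers to a random Sierpinski carpet generator built from Kronecker products. The following is an illustrative sketch of that idea, under the assumption of a single retention probability p: at each level the carpet is Kronecker-multiplied by a random 3×3 generator whose center cell is always removed and whose remaining cells are kept independently with probability p. This is a simplified single-parameter illustration, not the authors' exact [pppq] construction.

```python
import numpy as np

def random_generator(p, rng):
    """Random 3x3 generator: center removed, each other cell kept with prob. p."""
    g = (rng.random((3, 3)) < p).astype(int)
    g[1, 1] = 0  # the classic Sierpinski carpet always removes the center
    return g

def random_carpet(p, levels, seed=0):
    """Build a random Sierpinski-carpet mask by iterated Kronecker products."""
    rng = np.random.default_rng(seed)
    carpet = np.ones((1, 1), dtype=int)
    for _ in range(levels):
        # Each kept cell of the current carpet is refined by a fresh generator.
        carpet = np.kron(carpet, random_generator(p, rng))
    return carpet

c = random_carpet(p=0.9, levels=3)
print(c.shape)  # (27, 27)
```

With p = 1 the construction reduces to the deterministic Sierpinski carpet (8^n retained cells after n levels); smaller p thins the pattern, which is how the retention parameter connects to percolation behavior.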

Random Sierpinski carpet
Fractals induced by Kronecker product
Y_n = Y_(n−1) ⊗ X_(p_n)
Geometry and dimension
Percolation threshold
Estimation of a single parameter p
Estimation of two parameters p and q
Theoretical comparison
Experimental comparison
Application: cancer tissue discrimination
Findings
Compliance with ethical standards