Abstract
Asymptotic functions form a very general class that includes unit step functions, exponential functions, sigmoid functions, and any non-constant function that converges in at least one direction. The Generated Hidden Unit Vector (GHUV) is the “state” of an input in the hidden layer, consisting of the output values of the hidden units together with the constant 1. With weights and thresholds properly set, k hidden units with an asymptotic activation function can map any k + 1 distinct inputs to linearly independent GHUVs. For hidden units with an Analytic Asymptotic Activation Function (AAAF) and given inputs, this Linearly Independent Transformation (LIT) holds with probability 1 when the weights and thresholds are set randomly. The ability is “generic” in the weights and thresholds: the set of weight and threshold settings that implement the LIT for the given inputs is open and dense. Moreover, if a weight and threshold setting has the LIT ability for some k + 1 inputs, then the LIT is a generic and probability-1 property over all choices of k + 1 inputs. Therefore, k randomly configured hidden units with an AAAF map almost any k + 1 distinct inputs to linearly independent GHUVs. It is not true, however, that any nonlinear function can serve as the activation function and still give the net the LIT ability for any number of inputs: for polynomial activation functions, the number of hidden units with this LIT ability is limited by the order of the polynomial.
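The probability-1 claim can be illustrated numerically. The sketch below (not the paper's proof; the sizes k, d and the use of a sigmoid are illustrative assumptions) draws random weights and thresholds for k sigmoid hidden units, forms the GHUVs of k + 1 random distinct inputs, and checks that the resulting (k + 1) × (k + 1) matrix has full rank, i.e. the GHUVs are linearly independent:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid: an analytic asymptotic activation function (AAAF)."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
k, d = 5, 3                              # k hidden units, d-dimensional inputs (assumed sizes)
X = rng.normal(size=(k + 1, d))          # k + 1 distinct random inputs
W = rng.normal(size=(d, k))              # random weights
b = rng.normal(size=k)                   # random thresholds

H = sigmoid(X @ W + b)                   # hidden-unit outputs, shape (k + 1, k)
G = np.hstack([H, np.ones((k + 1, 1))])  # GHUVs: hidden outputs plus the constant 1

# Full rank (k + 1) means the GHUVs are linearly independent,
# which is what the LIT property asserts holds with probability 1.
print(np.linalg.matrix_rank(G))
```

Repeating the check with fresh random draws (or fresh inputs) should keep yielding full rank, consistent with the genericity of the property; by contrast, replacing the sigmoid with a low-order polynomial would cap the achievable rank once k exceeds the polynomial's order.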