Abstract

This paper offers general insight into how the neuron structure of a multilayer perceptron (MLP) affects its ability to solve classification problems. Most common neuron structures are based on monotonic activation functions and linear input mappings. In contrast, the proposed neuron structure uses a nonmonotonic activation function and/or a nonlinear input mapping to increase the power of a single neuron. An MLP built from these more powerful neurons usually requires fewer hidden nodes than a conventional MLP to solve classification problems. Fewer neurons mean fewer network weights to be determined by a learning algorithm, and reducing the number of weights, i.e., the dimension of the search space, generally improves learning performance: it helps the algorithm escape local optima and speeds up convergence, regardless of which learning algorithm is used. Several two-dimensional examples are worked out by hand to visualize how an appropriate choice of neuron structure can reduce the number of neurons. Moreover, to demonstrate the efficiency of the proposed scheme on real-world classification problems, the Iris data classification problem is solved using an MLP whose neurons are equipped with nonmonotonic activation functions, and the results are compared with those obtained using two well-known monotonic activation functions.
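As a minimal sketch of the core idea (not taken from the paper, which does not specify its activation function in the abstract), the example below assumes a Gaussian as a representative nonmonotonic activation and uses hand-picked weights: a single neuron with this activation reproduces the XOR pattern, while the same linear input mapping through a monotonic sigmoid cannot, illustrating how nonmonotonicity can reduce the required number of hidden nodes.

```python
import numpy as np

def sigmoid(z):
    # Monotonic activation: output increases with z.
    return 1.0 / (1.0 + np.exp(-z))

def gaussian(z):
    # Nonmonotonic activation (assumed here for illustration): peaks at z = 0.
    return np.exp(-z ** 2)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])  # XOR labels

# Hand-picked weights: z = x1 + x2 - 1 is zero exactly on the two positive
# XOR points, so gaussian(z) is near 1 there and below 0.5 on the others.
w, b = np.array([1.0, 1.0]), -1.0
z = X @ w + b

print("gaussian neuron:", (gaussian(z) > 0.5).astype(int))  # [0 1 1 0]

# The sigmoid is monotone in z, and z takes values -1, 0, 0, 1 on the four
# points, so no single threshold can place the two z = 0 points on one side
# and both z = -1 and z = 1 on the other.
print("sigmoid neuron: ", (sigmoid(z) > 0.5).astype(int))   # not XOR
```

A single sigmoid neuron would need at least two additional hidden neurons to solve XOR, which is consistent with the abstract's claim that higher-power neurons shrink the network and hence the weight-search space.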
