Abstract

Evaluating the number of hidden neurons necessary for solving pattern recognition and classification tasks is one of the key problems in artificial neural networks. The multilayer perceptron is the most widely used artificial neural network for estimating functional structure in classification. In this paper, we show that a two-hidden-layer feedforward neural network with d inputs, d neurons in the first hidden layer, 2d+2 neurons in the second hidden layer, k outputs, and an infinitely differentiable sigmoidal activation function can solve classification and pattern recognition problems with arbitrary accuracy. This result can be applied to design pattern recognition and classification models with an optimal number of hidden neurons and hidden layers. Experimental results on four well-known benchmark datasets from a machine learning repository show that the convergence and accuracy of the proposed model are acceptable.
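The fixed architecture stated above (d inputs, d neurons in the first hidden layer, 2d+2 in the second, k outputs, sigmoidal activation throughout) can be sketched as follows. This is a minimal illustration of the layer widths only, not the paper's construction or training procedure; the random weight initialization and the use of the sigmoid on the output layer are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Infinitely differentiable sigmoidal activation used throughout.
    return 1.0 / (1.0 + np.exp(-z))

def make_network(d, k):
    """Random parameters for the d -> d -> 2d+2 -> k architecture.

    Returns a list of (W, b) pairs, one per layer. The weights here are
    random placeholders; the paper's result concerns the existence of
    suitable weights, not how to find them.
    """
    sizes = [d, d, 2 * d + 2, k]
    return [(rng.standard_normal((m, n)), rng.standard_normal(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # Apply each layer: affine map followed by the sigmoid.
    for W, b in params:
        x = sigmoid(W @ x + b)
    return x
```

For d = 3 and k = 2 this yields three weight layers of shapes (3, 3), (8, 3), and (2, 8), and `forward` maps a length-3 input to a length-2 output.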

Highlights

  • The past two decades have seen an enormous change in the field of artificial neural networks and their applications

  • Practicality decreases with the increase in the number of neurons and the number of hidden layers in the neural network model

  • In [27], a new approach was proposed to fix the number of hidden neurons in a multilayer perceptron (MLP) architecture: the post-training employment of singular value decomposition and principal component analysis (PCA) to adjust the parameters of the network


Summary

INTRODUCTION

The past two decades have seen an enormous change in the field of artificial neural networks and their applications. In [24, 26], the authors proposed applying the singular value decomposition method in neural networks to estimate the number of hidden neurons. In [27], a new approach was proposed to fix the number of hidden neurons of a multilayer perceptron (MLP) architecture: the post-training employment of singular value decomposition and principal component analysis (PCA) to adjust the parameters of the network. In [28], the authors investigated bounds on the number of hidden neurons in a special type of network, called multi-valued multithreshold neural networks; they studied the properties of neural networks with a q-valued function defined on some space. A single-hidden-layer neural network model with r neurons in its hidden layer and input x = (x_1, …, x_d) evaluates a function of the form

N(x) = \sum_{i=1}^{r} c_i \, \sigma(\langle w_i, x \rangle - \theta_i),

where the weights w_i are vectors in \mathbb{R}^d, the \theta_i are threshold values, the c_i are real coefficients, and \sigma is a univariate function, called the activation function in the neural network literature.
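The single-hidden-layer model above can be evaluated directly from its definition. The sketch below, assuming the logistic sigmoid for \sigma, computes N(x) = \sum_i c_i \sigma(\langle w_i, x \rangle - \theta_i); the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def sigmoid(z):
    # Logistic sigmoid, one common choice of activation sigma.
    return 1.0 / (1.0 + np.exp(-z))

def single_hidden_layer(x, W, theta, c):
    """Evaluate N(x) = sum_i c_i * sigma(<w_i, x> - theta_i).

    x     : input vector of length d
    W     : (r, d) matrix whose rows are the weight vectors w_i
    theta : length-r vector of thresholds theta_i
    c     : length-r vector of real output coefficients c_i
    """
    # W @ x computes all inner products <w_i, x> at once.
    return float(c @ sigmoid(W @ x - theta))
```

With zero weights and thresholds, each hidden unit outputs sigmoid(0) = 0.5, so for r = 4 units with unit coefficients the network returns 2.0.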

PROBLEM STATEMENT
THE MAIN RESULT
SIMULATION RESULTS AND DISCUSSION
CONCLUSION
