Abstract
An Artificial Neural Network (ANN) with cross-connections is one of the most popular network structures. The structure contains an input layer, at least one hidden layer and an output layer. When analysing and describing an ANN structure, the first parameter one usually gives is the number of the ANN's layers. A hierarchical structure is the default and accepted way of describing the network, and under this assumption the network structure can be described from a different point of view: a set of concepts and models can be used to describe the complexity of the ANN's structure, together with a two-level learning algorithm. Applying the hierarchical structure to the learning algorithm, the ANN structure is divided into sub-networks. Every sub-network is responsible for finding the optimal values of its weight coefficients using a local target function that minimises the learning error. The second, coordination level of the learning algorithm is responsible for coordinating the local solutions and finding the minimum of the global target function. In this article special emphasis is placed on the coordinator's role in the learning algorithm and on its target function. In each iteration the coordinator sends coordination parameters to the first-level sub-networks. Using the input vector X and the teaching vector, the local procedures run and find their weight coefficients; in the same step, feedback information is calculated and sent to the coordinator. The process is repeated until the minimum of the local target functions is achieved. As an example, the two-level learning algorithm is used to implement an ANN in the underwriting process for classifying the category of health in a life insurance company.
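The iteration described above can be sketched in code. This is a minimal toy illustration, not the paper's algorithm: the sub-networks, the coordination parameter, its update rule and all names here are assumptions chosen only to show the two-level structure (local steps, feedback up, coordination parameters down).

```python
import numpy as np

class SubNetwork:
    """Toy first-level sub-network: a single linear weight trained by
    gradient steps on its local squared-error target (illustrative)."""
    def __init__(self):
        self.w = 0.0

    def local_step(self, X, Z, gamma, lr=0.1):
        # Local target: squared error plus a coordination term gamma
        # received from the second level (update rule is an assumption).
        grad = np.mean((self.w * X - Z) * X) + gamma * self.w
        self.w -= lr * grad
        return self.w  # feedback information sent to the coordinator

class Coordinator:
    """Second level: combines the local feedback and updates the
    coordination parameter to reduce the global target function."""
    def coordinate(self, feedbacks, X, Z):
        w_mean = np.mean(feedbacks)
        global_error = np.mean((w_mean * X - Z) ** 2)
        gamma = 0.01 * global_error  # illustrative coordination rule
        return gamma, global_error

def two_level_train(subnets, coordinator, X, Z, iters=200):
    gamma = 0.0
    for _ in range(iters):
        # First level: each sub-network minimises its local target.
        feedbacks = [net.local_step(X, Z, gamma) for net in subnets]
        # Second level: coordinator processes feedback and sends new
        # coordination parameters back down.
        gamma, err = coordinator.coordinate(feedbacks, X, Z)
    return err

X = np.array([1.0, 2.0, 3.0])
Z = 2.0 * X  # teaching vector for this toy task
err = two_level_train([SubNetwork(), SubNetwork()], Coordinator(), X, Z)
```

After the loop, `err` is the remaining global error; in this toy setting both sub-networks converge toward the same weight and the error approaches zero.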
Highlights
In practice many ANN structures are used but the most popular are the ANNs with forward connections that have a complete or semi-complete set of weight coefficients.
In the output layer the linear activation function is usually used for approximation tasks.
The coordinator is described by the Ψ function.
In the most common structures hidden layers include more neurons than the input layer, so input information is not compressed in the hidden layers.
Summary
In practice many ANN structures are used, but the most popular are the ANNs with forward connections that have a complete or semi-complete set of weight coefficients. Such a structure of an ANN is depicted in Fig. 1. Neurons in both the hidden and the output layers use a sigmoid or tanh activation function, v¹ᵢ = f¹(u¹ᵢ). In the most common structures hidden layers include more neurons than the input layer; for example, an ANN (10-15-8) includes 10 neurons in the input layer, 15 neurons in one hidden layer and 8 in the output one. A hierarchical description defines the variables, laws, principles and terminology by means of which an ANN is described; for such a hierarchical description the functioning on any level should be as independent as possible.
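The forward pass of the (10-15-8) network mentioned above can be written out directly. This is a sketch under stated assumptions: the text does not specify biases or weight initialisation, so the bias columns and random weights here are illustrative, and sigmoid is chosen from the sigmoid/tanh pair named in the summary.

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def forward(x, W1, W2):
    """Forward pass of a (10-15-8) ANN: 10 inputs, one hidden layer of
    15 neurons, 8 outputs. Sigmoid activation in the hidden and output
    layers; each weight matrix carries an extra bias column (assumed)."""
    u1 = W1 @ np.append(x, 1.0)   # hidden-layer net input
    v1 = sigmoid(u1)              # v1 = f1(u1), as in the summary
    u2 = W2 @ np.append(v1, 1.0)  # output-layer net input
    return sigmoid(u2)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((15, 11))  # 10 inputs + 1 bias
W2 = rng.standard_normal((8, 16))   # 15 hidden neurons + 1 bias
y = forward(rng.standard_normal(10), W1, W2)
```

With sigmoid activations every component of `y` lies in (0, 1), which is why the highlights note that a linear output layer is preferred for approximation tasks whose targets are not bounded this way.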
International Journal of Advanced Research in Artificial Intelligence