Abstract
Artificial Neural Networks (ANNs) use a variety of activation functions to update the states of their neurons. Research and engineering practice have long employed activation functions as the transfer functions of artificial neural networks. The most common reasons for using such transfer functions are their unit-interval boundaries, the quick computability of the function and its derivative, and several useful mathematical properties in approximation theory. The aim of this study is to identify the more robust activation function for accelerating HornSAT logic in the context of the Hopfield Neural Network. In this paper we developed Agent-Based Modelling (ABM) to assess the performance of the Zeng-Martinez Activation Function (ZMAF) and the Hyperbolic Tangent Activation Function (HTAF) alongside the Wan Abdullah method for doing Logic Programming (LP) in the Hopfield Neural Network (HNN). These assessments are carried out on the basis of the Hamming distance (HD), the global minima ratio (zM), and CPU time. NETLOGO 5.3.1 software was used to develop the Agent-Based Modelling (ABM) and to carry out the proposed comparison of the efficacy of the two activation functions, HTAF and ZMAF.
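As a concrete illustration, below is a minimal Python sketch of how an activation function is applied when a Hopfield neuron updates its state. Only HTAF (the hyperbolic tangent) is written out, since its closed form is standard; ZMAF is left as a swappable argument because its exact definition is given in the cited literature rather than in this summary. All function and variable names here are illustrative assumptions, not the authors' code.

import numpy as np

def htaf(x):
    # Hyperbolic Tangent Activation Function (HTAF): squashes the local field into (-1, 1).
    return np.tanh(x)

def update_neuron(weights, states, i, activation=htaf):
    # Compute the local field h_i = sum_j w_ij * s_j and pass it through the
    # chosen activation; ZMAF could be passed in place of htaf once defined.
    local_field = np.dot(weights[i], states)
    return activation(local_field)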
Highlights
Artificial Neural Network (ANN) is an exciting research field because it provides an alternative style of computation, and ANN is a step towards the understanding of artificial intelligence (AI) [1]
The purpose of this study is to develop Agent-Based Modelling (ABM) to accelerate the performance of logic programming in the Hopfield network using two activation functions, the Hyperbolic Tangent Activation Function (HTAF) and the Zeng-Martinez Activation Function (ZMAF)
In this paper we implemented ZMAF and HTAF to simulate the performance of doing logic programming in the Hopfield Neural Network (HNN)
Summary
Artificial Neural Network (ANN) is an exciting research field because it provides an alternative style of computation, and ANN is a step towards the understanding of artificial intelligence (AI) [1]. Neural networks commonly exhibit high-level behaviour such as the ability to recognise data, learn, and recall [2]. The Hopfield Neural Network (HNN) has many interesting applications, implementations and features, such as content-addressable memory and fault tolerance [3]. Learning in HNN is accomplished by modifying the strength of the connections between the neurons in the network.
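To make the evaluation criteria mentioned above concrete, the following is a minimal Python sketch under stated assumptions: it performs an asynchronous bipolar Hopfield relaxation with given synaptic weights (the Wan Abdullah clause-to-weight mapping itself is not reproduced here) and then computes the Hamming distance and the global minima ratio. The names, step count, and stopping rule are illustrative, not the authors' implementation.

import numpy as np

def hopfield_relax(weights, states, steps=1000, rng=None):
    # Asynchronous update of bipolar (+1/-1) neurons: pick a neuron at random
    # and align it with the sign of its local field.
    rng = rng or np.random.default_rng()
    states = states.copy()
    n = len(states)
    for _ in range(steps):
        i = rng.integers(n)
        states[i] = 1 if np.dot(weights[i], states) >= 0 else -1
    return states

def energy(weights, states):
    # Hopfield energy E = -1/2 * s^T W s (external bias omitted for brevity).
    return -0.5 * states @ weights @ states

def hamming_distance(a, b):
    # Number of neurons whose final state differs from the reference state.
    return int(np.sum(a != b))

def global_minima_ratio(final_energies, global_min_energy, tol=1e-6):
    # Fraction of independent runs whose final energy reaches the global minimum (zM).
    hits = sum(1 for e in final_energies if abs(e - global_min_energy) <= tol)
    return hits / len(final_energies)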