Abstract

One of the influential models in the artificial neural network (ANN) research field for representing knowledge in non-systematic logical rules is Random k Satisfiability. In this context, knowledge structure representation is also a potential application of Random k Satisfiability. Despite many attempts to represent logical rules in a non-systematic structure, previous studies have failed to consider higher-order logical rules. As the amount of information in the logical rule increases, the proposed network is unable to proceed to the retrieval phase, where the behavior of Random Satisfiability can be observed. This study addresses these issues by proposing higher-order Random k Satisfiability for k ≤ 3 in the Hopfield Neural Network (HNN). In this regard, introducing the 3 Satisfiability logical rule to the existing network increases the synaptic weight dimensions in Lyapunov's energy function and the local field. We also propose an Election Algorithm (EA) to optimize the learning phase of HNN and compensate for its high computational complexity. The proposed model is evaluated extensively using various performance metrics. The main findings indicate the compatibility and performance of the Random 3 Satisfiability logical representation, trained via EA in HNN, during the learning and retrieval phases in terms of error evaluations, energy analysis, similarity indices, and variability measures. The results also emphasize that the proposed Random 3 Satisfiability representation, incorporated with EA in HNN, is capable of optimizing the learning and retrieval phases compared to the conventional model, which deploys Exhaustive Search (ES).
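To make the dimensional increase mentioned above concrete, the following is a minimal sketch of the Lyapunov energy function and local field for a Hopfield network with synaptic weights up to third order, as commonly written for higher-order HNN formulations. The notation (bipolar neuron states S_i and weight tensors W^{(1)}, W^{(2)}, W^{(3)}) is generic shorthand rather than the paper's exact formulation.

    E = -\frac{1}{3} \sum_{i} \sum_{j} \sum_{k} W^{(3)}_{ijk} S_i S_j S_k
        -\frac{1}{2} \sum_{i} \sum_{j} W^{(2)}_{ij} S_i S_j
        -\sum_{i} W^{(1)}_{i} S_i

    h_i = \sum_{j} \sum_{k} W^{(3)}_{ijk} S_j S_k + \sum_{j} W^{(2)}_{ij} S_j + W^{(1)}_{i},
    \qquad S_i \leftarrow \operatorname{sgn}(h_i)

The sums are taken over distinct indices with symmetric weight tensors; the third-order term W^{(3)}_{ijk} is what the 3 Satisfiability clauses introduce on top of the conventional second-order network.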

Highlights

  • A hallmark of any Artificial Neural Network (ANN) is its ability to behave according to a pre-determined output or decision

  • The main findings of this research prove the compatibility of the PRAN3SAT logical representation, analysed via the Election Algorithm (EA) with the Hopfield Neural Network (HNN), in both the learning and retrieval phases based on error evaluations, energy analysis, similarity measures, and variability

  • Extensive computer simulations show that the formulated propositional logical rule PRAN3SAT, consisting of first-, second-, and third-order clauses, has been successfully and optimally embedded in the Hopfield Neural Network, indicating the flexibility of the logical representation (an illustrative sketch of such a mixed-order formula appears after these highlights)
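The paper's own PRAN3SAT generator is not reproduced here; the snippet below is only an illustrative sketch, under the assumption that a mixed-order formula is a conjunction of randomly negated clauses of length 1, 2, and 3. The function names and parameters (random_mixed_ksat, num_vars, counts) are hypothetical.

    import random

    def random_mixed_ksat(num_vars, counts, seed=None):
        """Illustrative generator for a mixed-order (k <= 3) CNF formula.

        Each clause is a list of signed integers: +v is the literal v,
        -v is its negation. Variables are numbered 1..num_vars.
        """
        rng = random.Random(seed)
        clauses = []
        for k, n_clauses in counts.items():          # e.g. {3: 2, 2: 2, 1: 2}
            for _ in range(n_clauses):
                chosen = rng.sample(range(1, num_vars + 1), k)
                clauses.append([v if rng.random() < 0.5 else -v for v in chosen])
        return clauses

    def is_satisfied(clauses, assignment):
        """True if every clause contains at least one true literal."""
        return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                   for clause in clauses)

    formula = random_mixed_ksat(num_vars=6, counts={3: 2, 2: 2, 1: 2}, seed=1)
    assignment = {v: True for v in range(1, 7)}
    print(formula, is_satisfied(formula, assignment))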

Introduction

A hallmark of any Artificial Neural Network (ANN) is its ability to behave according to a pre-determined output or decision. Without such "optimal" behavior, an ANN produces random outputs or decisions, which leads to useless information modelling. Although ANNs can learn and model complex relationships, which is important for representing real-life problems, they lack interpretability of the results they obtain and approximate. One notable ANN with an associative form of learning is the Hopfield Neural Network (HNN) [1]. Despite extraordinary development across various performance metrics for any given problem, the optimal structure of HNN is still debated among ANN practitioners. To this end, the choice of the most suitable symbolic structure to govern HNN must be given a fair share of attention.
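As a point of reference for the associative behavior described above, here is a minimal sketch of a generic second-order Hopfield network used as a content-addressable memory. It is the textbook HNN with Hebbian weights and asynchronous updates, not the paper's PRAN3SAT-specific model, and the function names (train_hebbian, retrieve, energy) are illustrative only.

    import numpy as np

    def train_hebbian(patterns):
        """Build the symmetric weight matrix from stored bipolar (+1/-1) patterns."""
        W = sum(np.outer(p, p) for p in patterns).astype(float)
        np.fill_diagonal(W, 0.0)              # no self-connections
        return W / patterns.shape[0]

    def retrieve(W, state, steps=100):
        """Asynchronous updates: each chosen neuron aligns with its local field."""
        state = state.copy()
        for _ in range(steps):
            i = np.random.randint(len(state))
            h = W[i] @ state                  # local field of neuron i
            state[i] = 1 if h >= 0 else -1
        return state

    def energy(W, state):
        """Lyapunov energy; it never increases under asynchronous updates."""
        return -0.5 * state @ W @ state

    stored = np.array([[1, -1, 1, -1, 1, -1],
                       [1,  1, -1, -1, 1,  1]])
    W = train_hebbian(stored)
    noisy = np.array([1, -1, 1, 1, 1, -1])    # corrupted copy of pattern 0
    recalled = retrieve(W, noisy)
    print(recalled, energy(W, recalled))

Retrieval drives the state toward a local minimum of the energy; the study's contribution lies in extending this mechanism to third-order weights derived from the Random 3 Satisfiability logical rule and in optimizing the learning phase with EA.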
