Abstract
Equivalence of computational systems can assist in obtaining abstract systems, and thus enable a better understanding of issues related to their design and performance. For more than four decades, artificial neural networks have been used in many scientific applications to solve classification problems as well as other problems. Since their introduction, the multilayer feedforward neural network referred to as the Ordinary Neural Network (ONN), which contains only summation-activation (Sigma) neurons, and the multilayer feedforward High-order Neural Network (HONN), which contains both Sigma neurons and product-activation (Pi) neurons, have been treated in the literature as different entities. In this work, we studied whether HONNs are mathematically equivalent to ONNs. We have proved that every HONN can be converted to an equivalent ONN. In most cases, one only needs to modify the neuronal transfer function of the Pi neuron to convert it to a Sigma neuron. The theorems we have derived show that the original HONN and its corresponding equivalent ONN give exactly the same output, which means they can both be used to perform exactly the same functionality. We also derived equivalence theorems for several other non-standard neural networks, for example, recurrent HONNs and HONNs with translated multiplicative neurons. This work rejects the hypothesis that HONNs and ONNs are different entities, a conclusion that might initiate a new research frontier in artificial neural network research.
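To illustrate the kind of transfer-function modification the abstract refers to, the following minimal sketch (not the paper's exact construction) shows numerically that a Pi neuron acting on strictly positive weighted inputs produces the same output as a Sigma neuron whose transfer function is exp(·), applied to the logarithms of the inputs. The function names pi_neuron and sigma_neuron_with_exp are hypothetical.

    import numpy as np

    # Sketch only: a Pi neuron multiplying positive weighted inputs can be
    # emulated by a Sigma neuron with an exponential transfer function, since
    # prod_i x_i^{w_i} = exp(sum_i w_i * ln(x_i)).

    def pi_neuron(x, w):
        # Product-activation (Pi) neuron: product of weighted inputs.
        return np.prod(x ** w)

    def sigma_neuron_with_exp(x, w):
        # Summation (Sigma) neuron with a modified, exponential transfer
        # function, fed the logarithms of the (strictly positive) inputs.
        return np.exp(np.sum(w * np.log(x)))

    x = np.array([0.5, 2.0, 1.5])   # strictly positive inputs (assumed here)
    w = np.array([1.0, 2.0, -1.0])

    print(pi_neuron(x, w))               # ~1.3333
    print(sigma_neuron_with_exp(x, w))   # same value, up to rounding error

The positivity restriction is an artifact of this particular sketch; the equivalence theorems referred to in the abstract address the conversion of every HONN, not only this special case.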
Highlights
Inspired by biological neuronal systems, several computational-intelligence-based classification systems have been developed in the past few decades and are widely known as computational neural networks.
The High-order Neural Network (HONN) was introduced to solve a challenging computer vision problem known as invariance. (Hughen & Hollon, 1991) stated that HONNs have the advantage of ease of training over multilayer perceptrons and claimed to achieve better classification of radar data than a Gaussian classifier. (Jeffries, 1995) employed HONNs for tracking, code recognition, and memory management. (Foltyniewicz, 1995) developed a product-activation (Pi)-Sigma-Pi network structure for effective recognition of human faces in gray-scale images irrespective of their position, orientation, and scale, and claimed that it has a small number of adjustable weights, rapid learning convergence, and excellent …
HONNs are shown to be equivalent to Ordinary Neural Networks (ONNs); this work rejects the hypothesis that HONNs and ONNs are different entities.
Summary
Inspired by biological neuronal systems, several computational-intelligence-based classification systems have been developed in the past few decades and are widely known as computational neural networks. (Kosmatopoulos, Polycarpou, Christodoulou, & Ioannou, 1995) showed that, by allowing enough high-order connections in a recurrent HONN, they were able to use it to approximate arbitrary dynamical systems. Their explanation is that the dynamic components are distributed throughout the network in the form of dynamic neurons. Multiplication is defined for whole numbers in terms of repeated addition, and even multiplication of real numbers is defined by systematic generalization of this basic idea. With this in mind, one could, theoretically and/or hypothetically, convert a HONN into a very complicated, large-sized, constrained web-like ONN, but does that mean they are equivalent?
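As a minimal, hypothetical illustration of the repeated-addition idea mentioned above (not a construction taken from the paper): multiplying by a fixed whole number can be unrolled into a chain of additions, which is the sense in which a HONN could, in principle, be flattened into a much larger summation-only network. The helper name repeated_add is an assumption of this sketch.

    def repeated_add(x, n):
        # Compute n * x for a whole number n using only addition.
        total = 0.0
        for _ in range(n):
            total += x
        return total

    print(repeated_add(2.5, 4))   # 10.0, identical to 4 * 2.5

The catch is that n must be a whole number fixed when the network is built, whereas a Pi neuron multiplies variable inputs; this is why the resulting ONN would be large and heavily constrained, which motivates the question posed above.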