Abstract
In this paper, we study the problem of maximizing an objective function over the discrete set {−1, 1}^n using a neural network. It is now known that a binary (two-state) Hopfield network can take, in the worst case, an exponential number of time steps to find even a local maximum of the objective function. We carry this argument further by studying the radius of attraction of the global maxima of the objective function. If a binary neural network is used, in general there is no guarantee that a global maximum has a nonzero radius of attraction. In other words, even if the optimization process is started with the neural network in an initial state that is adjacent to a global maximum, the resulting trajectory of the network may not converge to the nearby maximum, but may instead go off to another maximum. At the same time, another set of recent results shows that, if an analog neural network is used to optimize the same objective function, then every local maximum of the objective function has a nontrivial domain of attraction, and conversely, the only equilibria that are attractive are the local maxima of the objective function. This raises the question as to whether analog neural networks offer some advantages over binary neural networks for optimizing the same objective function. As a motivation for this line of inquiry, we study the problem of decoding an algebraic block code using a neural network. It is shown that the binary neural network implementation has the undesirable property that all the global maxima of the objective function have a zero radius of attraction. In contrast, if an analog neural network is used to maximize exactly the same objective function, the region of attraction of each maximum contains not only the associated “orthant” of the state space, but also some points outside this orthant. In other words, the analog implementation exhibits the desired tolerance to transmission errors, whereas the binary neural network does not have this property. With this motivation, two open questions are posed that provide a program of research for studying the possible superiority of analog neural networks over binary neural networks.
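For readers unfamiliar with the two network types contrasted in the abstract, the following minimal Python sketch sets a generic binary (asynchronous sign-update) Hopfield ascent next to a generic analog (continuous-state) relaxation on the same quadratic objective f(x) = ½ xᵀWx + bᵀx over {−1, 1}^n. The objective, the weights W and b, the tanh nonlinearity, and the Euler integration are standard textbook choices used purely for illustration; they are assumptions, not the specific construction or decoding objective analyzed in the paper.

```python
import numpy as np

# Illustrative sketch only: generic binary vs. analog Hopfield dynamics on
# f(x) = 0.5 * x^T W x + b^T x over x in {-1, 1}^n.  W, b, and the dynamics
# are textbook defaults, not the paper's construction.

def binary_hopfield_ascent(W, b, x0, max_sweeps=100):
    """Asynchronous sign updates; with symmetric W and zero diagonal,
    each accepted flip does not decrease f, so the iteration stops at a
    local maximum (a corner of {-1, 1}^n)."""
    x = x0.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(x)):
            h = W[i] @ x + b[i]            # local field at neuron i
            new_state = 1 if h >= 0 else -1
            if new_state != x[i]:
                x[i] = new_state
                changed = True
        if not changed:                    # no flip improves f: local maximum
            break
    return x

def analog_hopfield_relaxation(W, b, x0, gain=5.0, dt=0.01, steps=5000):
    """Euler integration of the analog dynamics du/dt = -u + W v + b with
    v = tanh(gain * u); the corner nearest the final state is read out."""
    u = x0.astype(float)
    for _ in range(steps):
        v = np.tanh(gain * u)
        u += dt * (-u + W @ v + b)
    return np.sign(np.tanh(gain * u))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    A = rng.standard_normal((n, n))
    W = (A + A.T) / 2                      # symmetric weights
    np.fill_diagonal(W, 0.0)               # zero self-coupling
    b = rng.standard_normal(n)
    x0 = rng.choice([-1, 1], size=n)       # initial corner of the hypercube
    print("binary :", binary_hopfield_ascent(W, b, x0))
    print("analog :", analog_hopfield_relaxation(W, b, x0))
```

The point of the contrast is the one made in the abstract: the binary iteration jumps between corners of the hypercube and may leave the neighborhood of a nearby maximum immediately, whereas the analog relaxation evolves through the interior of the cube, so each attractive equilibrium has a genuine region of attraction around it.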