Abstract
Recurrent neural networks can simulate any finite-state automaton as well as any multi-stack Turing machine. When the network architecture is constrained, however, this computational power may no longer hold. For example, recurrent cascade-correlation cannot simulate any finite-state automaton. It is therefore important to assess the computational power of a given network architecture, since this characterizes the class of functions that can, in principle, be computed by it. We discuss the computational power of neural networks for structures. Elman-style networks, cascade-correlation networks, and neural trees for structures are introduced. We show that Elman-style networks can simulate any frontier-to-root tree automaton, while neither cascade-correlation networks nor neural trees can. As a special case of the latter result, we obtain that neural trees for sequences cannot simulate any finite-state machine. © 1997 Elsevier Science Ltd. All Rights Reserved.
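To make the central notion concrete, here is a minimal illustrative sketch (not taken from the paper) of a frontier-to-root tree automaton: it processes a labeled tree from the leaves toward the root, assigning each node a state computed from the node's label and the states of its children via a transition table `delta`. The Boolean-expression automaton below is a standard textbook example chosen for illustration, not one used by the authors.

```python
def run_fta(tree, delta):
    """Run a frontier-to-root tree automaton.

    tree  -- a node of the form (label, list_of_children)
    delta -- transition table mapping (label, child_states) to a state
    Returns the state assigned to the root.
    """
    label, children = tree
    # Recurse into the subtrees first: states flow from frontier to root.
    child_states = tuple(run_fta(child, delta) for child in children)
    return delta[(label, child_states)]

# Toy automaton evaluating Boolean expression trees:
# leaves are labeled '0' or '1', internal binary nodes 'and' or 'or';
# the state of a node is the truth value of its subtree.
delta = {('0', ()): 0, ('1', ()): 1}
for a in (0, 1):
    for b in (0, 1):
        delta[('and', (a, b))] = a & b
        delta[('or', (a, b))] = a | b

# The tree for "1 and (0 or 1)" is accepted (root state 1).
tree = ('and', [('1', []), ('or', [('0', []), ('1', [])])])
root_state = run_fta(tree, delta)
```

The paper's positive result states that an Elman-style network for structures can implement any such transition table in its recurrent dynamics, whereas cascade-correlation networks and neural trees cannot.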