Abstract

We introduce a neural stack architecture that includes a differentiable, parameterized stack operator approximating the push and pop operations of a discrete stack. We prove that this architecture is stable under arbitrarily many stack operations: the state of the neural stack remains close to the state of the corresponding discrete stack. Combining the neural stack with a recurrent neural network, we devise a neural network Pushdown Automaton (nnPDA). A new theoretical bound shows that an nnPDA can simulate any PDA using only finite precision state neurons in finite time. By joining two neural stacks into a neural tape and coupling it with a recurrent neural network, we define a neural network Turing Machine (nnTM). As with the neural stack, we show that these architectures are stable. Furthermore, we show that the nnTM is Turing complete: it requires only finite precision state neurons, together with an arbitrary number of stack neurons, to simulate any TM in finite time. This provides a new and much stronger computational upper bound for neural networks that are Turing complete.
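The abstract does not spell out the stack operator itself, so as a rough point of reference, the sketch below implements a generic continuous ("soft") stack in the style of Grefenstette et al. (2015), where real-valued signals in [0, 1] interpolate between discrete push and pop. This is an assumed, illustrative construction, not the paper's parameterized operator; the names `soft_stack_step`, `soft_read`, `d_push`, and `d_pop` are hypothetical.

```python
# Minimal continuous ("soft") stack sketch, in the style of
# Grefenstette et al. (2015) -- NOT the paper's exact operator.
# Each stack entry is a value paired with a strength in [0, 1].

def soft_stack_step(values, strengths, push_val, d_push, d_pop):
    """One soft update: pop with total strength d_pop (eroded from the
    top down), then push push_val with strength d_push."""
    new_strengths = []
    remaining = d_pop
    for s in reversed(strengths):
        new_strengths.append(max(0.0, s - remaining))  # erode from the top
        remaining = max(0.0, remaining - s)
    new_strengths.reverse()
    return values + [push_val], new_strengths + [d_push]

def soft_read(values, strengths):
    """Soft 'top of stack': a strength-weighted sum of the topmost
    entries, using at most one unit of total strength."""
    read, budget = 0.0, 1.0
    for v, s in zip(reversed(values), reversed(strengths)):
        w = min(s, budget)
        read += w * v
        budget -= w
        if budget <= 0.0:
            break
    return read

# With d_push and d_pop restricted to {0, 1}, this reduces to a discrete stack:
values, strengths = soft_stack_step([], [], 1.0, 1.0, 0.0)             # push 1.0
values, strengths = soft_stack_step(values, strengths, 2.0, 1.0, 0.0)  # push 2.0
values, strengths = soft_stack_step(values, strengths, 0.0, 0.0, 1.0)  # pop
print(soft_read(values, strengths))  # -> 1.0
```

On this reading, the abstract's stability claim says that when the push/pop signals stay close to 0 or 1, the soft state remains provably close to the corresponding discrete stack state regardless of how many operations are applied, which is what makes the nnPDA and two-stack nnTM constructions possible.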
