Abstract

We consider recurrent neural networks that deal with symbolic formulas, terms, or, generally speaking, tree-structured data. Approaches such as the recursive autoassociative memory, discrete-time recurrent networks, folding networks, tensor construction, holographic reduced representations, and recursive reduced descriptions fall into this category. They share the same basic dynamics for processing structured data: each approach recursively encodes symbolic data into a connectionistic representation, or decodes symbolic data from a connectionistic representation, by means of a simple neural function. In this paper, we give an overview of the ability of neural networks with these dynamics to encode and decode tree-structured symbolic data. The related tasks of approximating and learning mappings whose input or output domain consists of structured symbolic data are examined as well.
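To make the shared dynamics concrete, the following is a minimal sketch of recursive encoding in the spirit of folding networks or the recursive autoassociative memory: a binary tree is mapped to a fixed-size vector by applying one simple neural function (an affine map followed by tanh) bottom-up at every inner node. All names, symbols, and dimensions here are illustrative assumptions, not the paper's concrete model.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # dimension of the connectionistic representation (assumed)

# One weight matrix and bias per inner symbol; leaf labels get fixed embeddings.
# The symbol set {"f", "g"} and leaf set {"a", "b"} are hypothetical.
W = {s: rng.normal(scale=0.5, size=(DIM, 2 * DIM)) for s in ("f", "g")}
b = {s: rng.normal(scale=0.1, size=DIM) for s in ("f", "g")}
LEAF = {"a": rng.normal(size=DIM), "b": rng.normal(size=DIM)}

def encode(tree):
    """Recursively encode a tree.

    A tree is either a leaf label (str) or a tuple (symbol, left, right).
    Inner nodes apply a simple neural function to the concatenated
    encodings of their children.
    """
    if isinstance(tree, str):  # leaf: look up its embedding
        return LEAF[tree]
    sym, left, right = tree
    children = np.concatenate([encode(left), encode(right)])
    return np.tanh(W[sym] @ children + b[sym])

# Encode the term f(g(a, b), a); the result is a single DIM-vector
# regardless of how deep the tree is.
code = encode(("f", ("g", "a", "b"), "a"))
```

Decoding runs the same idea in reverse: a neural function maps a vector back to a symbol and child vectors, applied recursively until leaves are produced.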
