Abstract

Memory-based learning can be characterized as a lazy learning method in machine-learning terminology because it defers processing, storing inputs until they are needed. Linguistic structure parsing, whose performance gains have plateaued since the most recent series of works, determines the syntactic or semantic structure of a sentence. In this article, we construct a memory component and use it to augment a linguistic structure parser, allowing the parser to extract patterns directly from the training treebank to form its memory. Experimental results show that, with this memory network, existing state-of-the-art parsers reach new heights of performance on the main benchmarks for dependency parsing and semantic role labeling.
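As a rough illustration only, the idea of a memory formed from training-treebank patterns can be sketched as follows. The article's actual component is a neural memory network; the pattern representation (head/dependent POS pairs), the `TreebankMemory` class, and its lookup below are hypothetical simplifications, not the paper's method.

```python
from collections import defaultdict


class TreebankMemory:
    """Toy memory mapping (head POS, dependent POS) patterns to arc labels.

    Hypothetical simplification: stores frequency counts of labels observed
    in a training treebank and retrieves the most frequent label at parse time.
    """

    def __init__(self):
        self.memory = defaultdict(lambda: defaultdict(int))

    def store(self, head_pos, dep_pos, label):
        # Record one (pattern -> label) observation from the training treebank.
        self.memory[(head_pos, dep_pos)][label] += 1

    def lookup(self, head_pos, dep_pos):
        # Return the most frequent label for this pattern, or None if unseen.
        counts = self.memory.get((head_pos, dep_pos))
        if not counts:
            return None
        return max(counts, key=counts.get)


# Populate the memory from (head POS, dependent POS, arc label) triples.
mem = TreebankMemory()
for triple in [("VERB", "NOUN", "obj"), ("VERB", "NOUN", "obj"),
               ("VERB", "NOUN", "nsubj"), ("NOUN", "ADJ", "amod")]:
    mem.store(*triple)

print(mem.lookup("VERB", "NOUN"))  # most frequent label for the pattern
print(mem.lookup("NOUN", "ADJ"))
```

A parser augmented with such a memory could consult `lookup` as an extra signal alongside its learned scorer; the article's version replaces these symbolic counts with learned memory representations.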
