Abstract

This paper presents a joint syntactic-semantic embedding model that not only uses syntactic information to enrich word embeddings but also generates distributed representations of the syntactic structures themselves. The syntactic input to our model comes from a Lexicalized Tree-Adjoining Grammar parser. The word embeddings from our model outperform Skip-gram embeddings in several word similarity and sentiment classification experiments, and the syntactic structure embeddings improve a transition-based dependency parser by a clear margin.