Abstract

Sentence compression aims to shorten a sentence into a compression that remains grammatical and preserves the underlying meaning of the original sentence. Previous work has recognized that linguistic features such as part-of-speech tags and dependency labels are helpful for compression generation. In this work, we introduce a gating mechanism and propose a gated neural network that selectively exploits linguistic knowledge for deletion-based sentence compression. Extensive experiments on four downstream datasets show that the proposed gated neural network produces better compressions than previous competitive compression methods, under both automatic metrics and human evaluation. We also observe that compressions generated by the proposed gated neural network share more grammatical relations with the ground-truth compressions than those of the baseline method, indicating that important grammatical relations, such as the subject or object of a sentence, are more likely to be kept in the compression by the proposed method. Furthermore, a visualization analysis explores the selective use of linguistic features, suggesting that the gating mechanism can condition the predicted compression on different linguistic features.
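The core idea of gating linguistic features can be sketched as follows. The abstract does not specify the architecture, so the dimensions, gate parameterization, and variable names below are illustrative assumptions, not the paper's actual model: a sigmoid gate reads both the word embedding and a linguistic-feature embedding (e.g. POS tag plus dependency label) and decides, per dimension, how much of the feature vector to pass on to the token-level keep/delete classifier.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper).
d_word, d_feat = 8, 4

# Per-token inputs: a word embedding and a linguistic-feature
# embedding (e.g. POS tag + dependency label), both hypothetical.
word = rng.standard_normal(d_word)
feat = rng.standard_normal(d_feat)

# Gate parameters: the gate reads both views and outputs, per
# feature dimension, a value in (0, 1) controlling how much of the
# linguistic features to let through.
W_g = rng.standard_normal((d_feat, d_word + d_feat))
b_g = np.zeros(d_feat)
gate = sigmoid(W_g @ np.concatenate([word, feat]) + b_g)

# Gated feature vector, concatenated with the word embedding to
# form the input to a sequence labeler that predicts keep/delete.
gated_feat = gate * feat
token_input = np.concatenate([word, gated_feat])

print(token_input.shape)  # (12,)
```

In a full deletion-based model, this gated representation would be computed for every token and fed to a sequence encoder (e.g. a BiLSTM) whose per-token outputs are classified as keep or delete.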
