Abstract

Recently, Self-Attention Networks (SANs) have shown their flexibility in parallel computation and their effectiveness in modeling both short- and long-term dependencies. However, SANs face two problems: 1) the weighted averaging inhibits relations among neighboring words (i.e., local context); and 2) dependencies between representations are calculated without considering contextual information (i.e., global context). Both local and global contexts have proven useful for modeling dependencies among neural representations in a variety of natural language processing tasks. Accordingly, we augment SANs with the ability to capture useful local and global context while maintaining their simplicity and flexibility. First, we cast local context modeling as a learnable Gaussian bias, which indicates the center and scope of the local region that should receive more attention. The bias is then incorporated into the original attention distribution to form a revised version. Second, we leverage the internal representations that embed sentence-level information as the global context. Specifically, we propose to contextualize the transformations of the query and key layers, which are used to calculate the relevance between elements. Since the two approaches are potentially complementary, we propose combining them to further improve performance. Empirical results on machine translation and linguistic probing tasks demonstrate the effectiveness and universality of the proposed approaches. Further analyses confirm that our approaches successfully capture contextual information as expected.
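The local-context idea above can be illustrated with a minimal NumPy sketch: a Gaussian bias, peaked at a per-query center and scaled by a per-query width, is added to the scaled dot-product logits before the softmax, so the revised attention distribution favors a local neighborhood. This is an illustrative sketch only; in the paper the center and width are predicted by learnable layers, whereas here (`gaussian_biased_attention`, `center`, `width`) they are hypothetical names and are passed in directly.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gaussian_biased_attention(q, k, v, center, width):
    """Self-attention with a Gaussian bias on the logits (illustrative sketch).

    q, k, v: arrays of shape (n, d).
    center[i], width[i]: the center position and scope of the local region
    for query i. In the paper these are learned; here they are given inputs.
    Returns (output, attention_weights).
    """
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)              # (n, n) scaled dot-product scores
    pos = np.arange(k.shape[0])                # key positions 0..n-1
    # Gaussian bias: zero at the center, increasingly negative with distance;
    # a smaller width makes the local window sharper.
    bias = -((pos[None, :] - center[:, None]) ** 2) / (2 * width[:, None] ** 2)
    weights = softmax(logits + bias)           # bias revises the attention distribution
    return weights @ v, weights
```

With a narrow width, almost all attention mass lands on the center position; with a large width the bias flattens out and the distribution approaches plain dot-product attention.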
