Abstract
Aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity expressed in a text toward a particular aspect. While previous models have used dependency graphs and GNNs to facilitate information exchange, they face challenges such as the smoothing of aspect representations and a mismatch between word-based dependency graphs and subword-based BERT. To address these deficiencies, we propose SRE-BERT, a new approach that flexibly exploits syntactic knowledge to enhance aspect representations through syntax representations. First, we propose a syntax representation encoder that produces a syntactic vector for each token. Then, we design a syntax-guided transformer that uses these syntax representations to compute multi-head attention, enabling direct syntactic interaction between any two tokens. Finally, the token-level vectors produced by the syntax-guided transformer are used to enhance the semantic representations obtained from BERT. In addition, we introduce a Masked POS Label Prediction (MPLP) task to pre-train the syntax encoder. Extensive experiments on datasets from three distinct domains show that SRE-BERT outperforms the second-best model by 1.97%, 1.55%, and 1.20% on the Rest14, Lap14, and Twitter datasets, respectively.
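To make the idea of syntax-guided attention concrete, below is a minimal, hypothetical sketch (in PyTorch) of an attention layer where the attention weights are computed from per-token syntax vectors while the values come from the semantic (BERT) representations. The class name `SyntaxGuidedAttention`, the dimensions, and all parameter names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SyntaxGuidedAttention(nn.Module):
    """Sketch: attention scores are derived from syntax representations,
    then applied to semantic (BERT) token representations (assumed design)."""

    def __init__(self, d_model: int, d_syntax: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Queries and keys are projected from the syntactic vectors.
        self.q_proj = nn.Linear(d_syntax, d_model)
        self.k_proj = nn.Linear(d_syntax, d_model)
        # Values are projected from the semantic (BERT) vectors.
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, syntax_repr: torch.Tensor, semantic_repr: torch.Tensor) -> torch.Tensor:
        # syntax_repr:   (batch, seq_len, d_syntax)
        # semantic_repr: (batch, seq_len, d_model)
        B, T, _ = semantic_repr.shape

        def split_heads(x):  # (B, T, d_model) -> (B, n_heads, T, d_head)
            return x.view(B, T, self.n_heads, self.d_head).transpose(1, 2)

        q = split_heads(self.q_proj(syntax_repr))
        k = split_heads(self.k_proj(syntax_repr))
        v = split_heads(self.v_proj(semantic_repr))

        # Syntax-based scores allow direct interaction between any two tokens.
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return self.out_proj(out)


# Example usage (shapes are illustrative):
# layer = SyntaxGuidedAttention(d_model=768, d_syntax=128, n_heads=8)
# enhanced = layer(syntax_vectors, bert_hidden_states)
```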