Abstract

The aspect term extraction (ATE) task aims to extract, from review sentences, the aspect terms that describe a part or an attribute of a product. Most existing works rely on either general-purpose or domain-specific word embeddings to address this problem. Despite promising results, most methods still overlook the relative importance of general and domain embeddings, which degrades performance. Moreover, word embeddings are tied to the downstream task, and how to regularize them to capture context-aware information remains an open problem. To address these issues, we first propose context-aware dynamic word embedding (CDWE), which simultaneously considers the general meanings, domain-specific meanings, and contextual information of words. Building on CDWE, we propose an attention-based convolutional neural network for ATE, called ADWE-CNN, which adaptively captures the aforementioned meanings of words by using an attention mechanism to assign different importance to the respective embeddings. Experimental results show that ADWE-CNN achieves performance comparable to state-of-the-art approaches. Ablation studies have been conducted to examine the contribution of each component. Our code is publicly available at http://github.com/xiejiajia2018/ADWE-CNN.
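To make the embedding-fusion idea concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' released code; module and dimension names are assumptions) of how per-token general, domain, and context embeddings can be combined by attention weights before being fed to a CNN sequence labeler:

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveEmbeddingFusion(nn.Module):
    """Fuses per-token general, domain, and contextual embedding views by
    scoring each view and taking an attention-weighted sum."""
    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)  # one scalar score per embedding view, per token

    def forward(self, general, domain, context):
        # Each input: (batch, seq_len, dim); stack views -> (batch, seq_len, 3, dim)
        views = torch.stack([general, domain, context], dim=2)
        weights = F.softmax(self.score(views).squeeze(-1), dim=-1)  # (batch, seq_len, 3)
        return (weights.unsqueeze(-1) * views).sum(dim=2)           # (batch, seq_len, dim)

# Usage: fuse three 100-dimensional views for 2 sentences of 5 tokens each.
fusion = AttentiveEmbeddingFusion(dim=100)
g, d, c = (torch.randn(2, 5, 100) for _ in range(3))
fused = fusion(g, d, c)  # shape (2, 5, 100), input to the downstream CNN for ATE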
