Abstract

The aspect term extraction (ATE) task aims to extract aspect terms, i.e., words describing a part or an attribute of a product, from review sentences. Most existing works rely on either general or domain embeddings to address this problem. Despite promising results, most methods still ignore the relative importance of general and domain embeddings, which degrades performance. Moreover, word embeddings interact with the downstream task, and how to regularize them so that they capture context-aware information remains an open problem. To address these issues, we first propose context-aware dynamic word embedding (CDWE), which simultaneously considers the general meanings, domain-specific meanings, and contextual information of words. Building on CDWE, we propose an attention-based convolutional neural network for ATE, called ADWE-CNN, which adaptively captures these different meanings of words by using an attention mechanism to assign different importance to the respective embeddings. Experimental results show that ADWE-CNN achieves performance comparable to state-of-the-art approaches. Various ablation studies have been conducted to explore the benefit of each component. Our code is publicly available at http://github.com/xiejiajia2018/ADWE-CNN.
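To make the fusion idea concrete, below is a minimal sketch (not the authors' implementation, which is available at the repository above) of how an attention mechanism can weight general and domain embeddings per token before a convolutional tagger. All class names, layer sizes, and the BIO tag count are illustrative assumptions.

```python
# Sketch of attention-weighted fusion of general and domain embeddings,
# followed by a 1-D convolution producing per-token tag logits.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveDynamicEmbedding(nn.Module):
    """Fuse per-token general and domain embeddings with learned attention weights."""

    def __init__(self, vocab_size, emb_dim):
        super().__init__()
        self.general = nn.Embedding(vocab_size, emb_dim)  # e.g. initialized from general-purpose vectors
        self.domain = nn.Embedding(vocab_size, emb_dim)   # e.g. initialized from domain-trained vectors
        self.attn = nn.Linear(emb_dim, 1)                 # scores each embedding view

    def forward(self, token_ids):                          # token_ids: (batch, seq_len)
        views = torch.stack(                               # (batch, seq_len, 2, emb_dim)
            [self.general(token_ids), self.domain(token_ids)], dim=2
        )
        weights = F.softmax(self.attn(views), dim=2)       # importance of each view per token
        return (weights * views).sum(dim=2)                # (batch, seq_len, emb_dim)


class SimpleATETagger(nn.Module):
    """Convolution over the fused embeddings, emitting per-token BIO tag logits."""

    def __init__(self, vocab_size, emb_dim=300, num_tags=3):
        super().__init__()
        self.embed = AttentiveDynamicEmbedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, 128, kernel_size=3, padding=1)
        self.out = nn.Linear(128, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids).transpose(1, 2)          # (batch, emb_dim, seq_len) for Conv1d
        h = F.relu(self.conv(x)).transpose(1, 2)           # (batch, seq_len, 128)
        return self.out(h)                                 # per-token tag logits


# Usage: tag a toy batch of token ids.
model = SimpleATETagger(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 12)))           # shape (2, 12, 3)
```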
