Abstract
We develop a computational approach to entity disambiguation based on memory networks. The approach uses an attention mechanism to automatically find important clues about a mention in its surrounding context, and leverages these clues to disambiguate the entity. Unlike existing feature-based methods, it does not rely on any manually designed features. Unlike existing neural models such as recurrent or convolutional neural networks, it weighs the importance of context words explicitly. The model can be trained end-to-end with back-propagation. To learn the model parameters effectively, we automatically collect large-scale mention-entity pairs from Wikipedia as training data. We verify the effectiveness of the proposed approach on a benchmark dataset from the TAC-KBP 2010 evaluation. Experimental results demonstrate that our approach empirically surpasses strong feature-based and neural-network-based methods. Model analysis further reveals that our approach can discover important clues from contexts.
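The core idea of attention over context words can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual architecture: it uses plain dot-product attention with random embeddings, whereas the paper describes a learned memory network trained on Wikipedia mention-entity pairs. All names (`attend`, `rank_entities`, etc.) are invented for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attend(context_embeds, mention_embed):
    # Score each context word by its dot product with the mention,
    # then form a weighted sum: the "clue" representation.
    scores = context_embeds @ mention_embed
    weights = softmax(scores)
    return weights @ context_embeds, weights

def rank_entities(clue, entity_embeds):
    # Score each candidate entity by similarity to the attended clue.
    return entity_embeds @ clue

# Toy example with random embeddings (a real model would learn these).
rng = np.random.default_rng(0)
d = 8
context = rng.normal(size=(5, d))   # 5 context-word embeddings
mention = rng.normal(size=d)        # mention embedding
entities = rng.normal(size=(3, d))  # 3 candidate-entity embeddings

clue, weights = attend(context, mention)
scores = rank_entities(clue, entities)
best = int(np.argmax(scores))       # index of the top-ranked candidate
```

The attention weights make the contribution of each context word explicit, which is the interpretability property the abstract contrasts with recurrent and convolutional models.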