Abstract

Word embedding is a technique for representing words as numerical vectors. Existing word embedding methods work well for homosemous (single-sense) words, but constructing word vectors for polysemous words remains a challenge, because the vector must reflect the context in which the word appears. This article presents an adaptive word embedding technique that is applicable to both homosemous and polysemous words. When an ambiguous word is represented as a vector, the surrounding context information is taken into account, so the technique generates a dynamic word vector for each occurrence of the word. The word vectors produced here have 198 dimensions, corresponding to the 198 features considered in the model; countable nouns serve as the features in the adaptive word embedding.
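A minimal sketch of the idea, in Python, is shown below, assuming a toy feature list and a simple co-occurrence score; the feature nouns, the scoring rule, and the example contexts are hypothetical placeholders, not the paper's actual implementation, and the real model uses 198 countable-noun features rather than the handful shown here.

```python
# Sketch of a context-dependent ("dynamic") word vector, assuming a small,
# hypothetical set of countable-noun features (the paper uses 198 of them).
from collections import Counter

FEATURE_NOUNS = ["river", "money", "tree", "animal", "computer", "building"]

def adaptive_vector(word: str, context: list[str]) -> list[float]:
    """Build a context-dependent vector for an ambiguous word.

    Each dimension corresponds to one countable-noun feature; its value is
    the normalised frequency of that feature among the context words, so the
    same word receives different vectors in different contexts.
    """
    counts = Counter(w.lower() for w in context if w.lower() != word.lower())
    total = sum(counts.values()) or 1
    return [counts[feature] / total for feature in FEATURE_NOUNS]

# The ambiguous word "bank" gets different vectors in different sentences.
print(adaptive_vector("bank", "he sat on the bank of the river".split()))
print(adaptive_vector("bank", "she deposited money at the bank".split()))
```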
