Abstract

Multilingual Word Embeddings (MWEs) represent words from multiple languages in a single distributional vector space. Unsupervised MWE (UMWE) methods acquire multilingual embeddings without cross-lingual supervision, which is a significant advantage over traditional supervised approaches and opens many new possibilities for low-resource languages. Prior art for learning UMWEs, however, merely relies on a number of independently trained Unsupervised Bilingual Word Embeddings (UBWEs) to obtain multilingual embeddings. These methods fail to leverage the interdependencies that exist among many languages. To address this shortcoming, we propose a fully unsupervised framework for learning MWEs that directly exploits the relations between all language pairs. Our model substantially outperforms previous approaches in the experiments on multilingual word translation and cross-lingual word similarity. In addition, our model even beats supervised approaches trained with cross-lingual resources.

Highlights

  • Continuous distributional word representations (Turian et al., 2010) have become a common technique across a wide variety of NLP tasks

  • We present our unsupervised MWE (UMWE) model, which jointly maps the monolingual embeddings of all N languages into a single space by explicitly leveraging the interdependencies between arbitrary language pairs, yet is computationally as efficient as learning O(N) BWEs (instead of O(N²)); see the sketch after this list

  • We present experimental results demonstrating the effectiveness of our unsupervised MWE method on two benchmark tasks: multilingual word translation and the SemEval-2017 cross-lingual word similarity task
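
The complexity claim in the first point can be made concrete. If each language l is assigned a single orthogonal map M_l into a shared space, then any directed pair (i, j) is served by composing M_i with the inverse (transpose) of M_j, so N stored maps cover all N(N−1) pairs. The sketch below illustrates only this counting argument on synthetic data; the language codes, dimensions, and the translate_idx helper are illustrative assumptions, not the paper's model or training procedure (which must learn the maps without supervision):

```python
# Minimal sketch (not the paper's method) of why a shared target space
# needs only O(N) mappings: each language l gets one orthogonal map M_l
# into the shared space, and any pair (i, j) is then served by the
# composition M_i followed by M_j^{-1} = M_j^T.
import numpy as np

rng = np.random.default_rng(0)
d, vocab_size = 32, 100
langs = ["en", "de", "fr", "es"]            # N = 4 languages (illustrative)

# Toy monolingual embeddings: each language is a random rotation of a
# common latent space, so exact cross-lingual matches exist by design.
latent = rng.normal(size=(vocab_size, d))
rotations = {l: np.linalg.qr(rng.normal(size=(d, d)))[0] for l in langs}
emb = {l: latent @ rotations[l] for l in langs}

# One map per language into the shared space: here simply the inverse
# rotation. A real UMWE system would have to *learn* these N maps.
maps = {l: rotations[l].T for l in langs}

def translate_idx(i, src, tgt):
    """Nearest-neighbor translation of word index i from src to tgt."""
    z = emb[src][i] @ maps[src] @ maps[tgt].T   # src -> shared -> tgt
    scores = emb[tgt] @ z / np.linalg.norm(emb[tgt], axis=1)  # cosine
    return int(np.argmax(scores))

# The 4 stored maps serve all 4 * 3 = 12 directed language pairs.
assert all(translate_idx(7, s, t) == 7 for s in langs for t in langs if s != t)
print(f"{len(maps)} maps serve {len(langs) * (len(langs) - 1)} language pairs")
```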


Summary

Introduction

Continuous distributional word representations (Turian et al., 2010) have become a common technique across a wide variety of NLP tasks. Bilingual Word Embeddings (BWEs) extend this idea by representing the words of two languages in a shared vector space. For training BWEs, cross-lingual supervision is required, either in the form of parallel corpora (Klementiev et al., 2012; Zou et al., 2013) or in the form of bilingual lexica (Mikolov et al., 2013a; Xing et al., 2015). This makes learning BWEs for low-resource language pairs much more difficult. Recent work proposes approaches to obtain unsupervised BWEs without relying on any bilingual resources (Zhang et al., 2017; Lample et al., 2018b).
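
To make the role of that supervision concrete, the lexicon-based methods cited above (Mikolov et al., 2013a; Xing et al., 2015) learn a linear map from source to target embeddings using known translation pairs; constraining the map to be orthogonal, as Xing et al. advocate, gives a closed-form orthogonal Procrustes solution. The following is a minimal sketch of this supervised recipe under stated assumptions; fit_orthogonal_map and the toy rotation data are illustrative, not a reference implementation:

```python
# Minimal sketch of supervised BWE learning with an orthogonal map
# (in the spirit of Mikolov et al., 2013a and Xing et al., 2015).
# All names and toy data here are illustrative assumptions.
import numpy as np

def fit_orthogonal_map(X, Y):
    """Solve min_W ||X W - Y||_F s.t. W^T W = I (orthogonal Procrustes).

    X: (n, d) source-language embeddings of the lexicon's source words.
    Y: (n, d) target-language embeddings of their translations.
    Returns the (d, d) orthogonal map from source into target space.
    """
    # Closed-form solution: SVD of the cross-covariance matrix X^T Y.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy check: if the "target" space is an exact rotation of the "source"
# space, the Procrustes solution recovers that rotation from the lexicon.
rng = np.random.default_rng(0)
n, d = 1000, 50
X = rng.normal(size=(n, d))                        # source embeddings
Q_true = np.linalg.qr(rng.normal(size=(d, d)))[0]  # hidden rotation
Y = X @ Q_true                                     # target embeddings
W = fit_orthogonal_map(X, Y)
print(np.allclose(W, Q_true, atol=1e-6))           # True
```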
