Word Embedding Transformation for Robust Unsupervised Bilingual Lexicon Induction

26 May 2021
Hailong Cao
Tiejun Zhao
Abstract

Great progress has been made in unsupervised bilingual lexicon induction (UBLI) by aligning source and target word embeddings that are independently trained on monolingual corpora. The common assumption of most UBLI models is that the embedding spaces of the two languages are approximately isomorphic; performance is therefore bounded by the degree of isomorphism, especially for etymologically and typologically distant languages. To address this problem, we propose a transformation-based method that increases the isomorphism: the embeddings of the two languages are made to match each other through rotation and scaling. The method does not require any form of supervision and can be applied to any language pair. On a benchmark bilingual lexicon induction dataset, our approach achieves competitive or superior performance compared with state-of-the-art methods, with particularly strong results on distant languages.
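To make the "rotation and scaling" idea concrete, the sketch below shows one standard way to align two embedding matrices: an orthogonal Procrustes rotation followed by a closed-form global scale. This is a minimal illustration only, not the paper's actual algorithm; in particular, it assumes the rows of the two matrices are already paired (e.g., via a hypothetical seed dictionary), whereas the paper's method is fully unsupervised.

```python
import numpy as np

def rotate_and_scale(X, Y):
    """Align X toward Y by rotation, then a single global scale.

    X, Y: (n, d) embedding matrices with rows assumed to be paired.
    NOTE: this pairing assumption is for illustration only; the
    paper's method requires no supervision of this kind.
    """
    # Orthogonal Procrustes: W = argmin ||X W - Y||_F over orthogonal W,
    # solved in closed form from the SVD of X^T Y.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    W = U @ Vt
    X_rot = X @ W
    # Global scale s minimizing ||s * X_rot - Y||_F in closed form.
    s = np.sum(X_rot * Y) / np.sum(X_rot * X_rot)
    return s * X_rot, W, s

# Toy usage with random "embeddings".
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 300))
Y = rng.standard_normal((1000, 300))
X_aligned, W, s = rotate_and_scale(X, Y)
print(X_aligned.shape, float(s))
```

Restricting the rotation to be orthogonal preserves distances and angles within each space; the scaling factor then adjusts for differing overall norms between the two embedding spaces, which is one simple way such a transformation can raise the effective degree of isomorphism.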
