Triangular Architecture for Rare Language Translation

13 May 2018
Shuo Ren
Wenhu Chen
Shujie Liu
Mu Li
M. Zhou
Shuai Ma
arXiv:1805.04813
Abstract

Neural Machine Translation (NMT) performs poorly on a low-resource language pair (X,Z), especially when Z is a rare language. By introducing another rich language Y, we propose a novel triangular training architecture (TA-NMT) that leverages bilingual data (Y,Z) (which may be small) and (X,Y) (which can be rich) to improve the translation performance of the low-resource pair. In this triangular architecture, Z is taken as the intermediate latent variable, and the translation models involving Z are jointly optimized with a unified bidirectional EM algorithm under the goal of maximizing the translation likelihood of (X,Y). Empirical results demonstrate that our method significantly improves the translation quality of rare languages on the MultiUN and IWSLT2012 datasets, and achieves even better performance when combined with back-translation methods.
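To make the latent-variable formulation concrete, the sketch below writes out a standard EM-style lower bound of the kind one would maximize when Z is treated as an intermediate latent variable between X and Y. The proposal distribution Q(z), the approximation p(y|z,x) ≈ p(y|z), and the E/M-step reading in the comments are stated here as assumptions of this sketch, not a reproduction of the paper's exact derivation.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Z: the rare language, treated as a latent variable between X and Y.
% Assumption of this sketch: Z mediates the pair, so p(y|z,x) is
% approximated by p(y|z). Q(z) is a proposal (variational) distribution.
\begin{align}
  \log p(y \mid x)
    &= \log \sum_{z} p(z \mid x)\, p(y \mid z, x) \notag \\
    &\approx \log \sum_{z} p(z \mid x)\, p(y \mid z) \notag \\
    &\ge \mathbb{E}_{Q(z)}\!\left[ \log p(y \mid z) \right]
       - \mathrm{KL}\!\left( Q(z) \,\Vert\, p(z \mid x) \right).
\end{align}
% E-step: tighten the bound by moving Q(z) toward the posterior over z;
% M-step: update the translation models p(z|x) and p(y|z) to raise the
% bound. Running the same scheme in the reverse direction (y -> z -> x)
% would give a bidirectional variant of the procedure.
\end{document}

Under this reading, the (X,Z) and (Z,Y) translation models are improved indirectly, by being pushed to explain the rich (X,Y) data through the rare language Z.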
