
Learning Contextualised Cross-lingual Word Embeddings and Alignments for Extremely Low-Resource Languages Using Parallel Corpora
arXiv:2010.14649 · 27 October 2020
Takashi Wada, Tomoharu Iwata, Yuji Matsumoto, Timothy Baldwin, Jey Han Lau

Papers citing "Learning Contextualised Cross-lingual Word Embeddings and Alignments for Extremely Low-Resource Languages Using Parallel Corpora"

5 / 5 papers shown
Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation
Mozhdeh Gheini, Xiang Ren, Jonathan May · 18 Apr 2021
Word Alignment by Fine-tuning Embeddings on Parallel Corpora
Zi-Yi Dou, Graham Neubig · 20 Jan 2021
Word Translation Without Parallel Data
Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou · 11 Oct 2017
A Strong Baseline for Learning Cross-Lingual Word Embeddings from Sentence Alignments
Omer Levy, Anders Søgaard, Yoav Goldberg · 18 Aug 2016
Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning · 17 Aug 2015