Multilingual BERT Post-Pretraining Alignment (arXiv:2010.12547)
Lin Pan, Chung-Wei Hang, Haode Qi, Abhishek Shah, Saloni Potdar, Mo Yu
23 October 2020
Papers citing "Multilingual BERT Post-Pretraining Alignment" (9 of 9 papers shown):
Exploring the Relationship between Alignment and Cross-lingual Transfer in Multilingual Transformers. Félix Gaschi, Patricio Cerda, Parisa Rastin, Y. Toussaint. 05 Jun 2023.
Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model. Mingqi Li, Fei Ding, Dan Zhang, Long Cheng, Hongxin Hu, Feng Luo. 02 Nov 2022.
A Simple and Effective Method to Improve Zero-Shot Cross-Lingual Transfer Learning. Kunbo Ding, Weijie Liu, Yuejian Fang, Weiquan Mao, Zhe Zhao, Tao Zhu, Haoyan Liu, Rong Tian, Yiren Chen. 18 Oct 2022.
When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation? Zhuoyuan Mao, Chenhui Chu, Raj Dabre, Haiyue Song, Zhen Wan, Sadao Kurohashi. 26 Apr 2022.
Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages. Bin Li, Yixuan Weng, Fei Xia, Hanjun Deng. 09 Apr 2022.
CINO: A Chinese Minority Pre-trained Language Model. Ziqing Yang, Zihang Xu, Yiming Cui, Baoxin Wang, Min-Bin Lin, Dayong Wu, Zhigang Chen. 28 Feb 2022.
Sequential Reptile: Inter-Task Gradient Alignment for Multilingual Learning. Seanie Lee, Haebeom Lee, Juho Lee, Sung Ju Hwang. 06 Oct 2021.
Improved Text Classification via Contrastive Adversarial Training. Lin Pan, Chung-Wei Hang, Avirup Sil, Saloni Potdar. 21 Jul 2021.
MLQA: Evaluating Cross-lingual Extractive Question Answering. Patrick Lewis, Barlas Oğuz, Ruty Rinott, Sebastian Riedel, Holger Schwenk. 16 Oct 2019.