FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding

10 September 2020
Yuwei Fang, Shuohang Wang, Zhe Gan, S. Sun, Jingjing Liu
VLM

Papers citing "FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding"

26 / 26 papers shown

Language Fusion for Parameter-Efficient Cross-lingual Transfer
Philipp Borchert, Ivan Vulić, Marie-Francine Moens, Jochen De Weerdt
12 Jan 2025

Sharing, Teaching and Aligning: Knowledgeable Transfer Learning for Cross-Lingual Machine Reading Comprehension
Tingfeng Cao, Chengyu Wang, Chuanqi Tan, Jun Huang, Jinhui Zhu
12 Nov 2023

Zero-shot Cross-lingual Transfer without Parallel Corpus
Yuyang Zhang, Xiaofeng Han, Baojun Wang
VLM
07 Oct 2023

Gradient Sparsification For Masked Fine-Tuning of Transformers
J. Ó. Neill, Sourav Dutta
19 Jul 2023

Free Lunch: Robust Cross-Lingual Transfer via Model Checkpoint Averaging
Fabian David Schmidt, Ivan Vulić, Goran Glavaš
26 May 2023

DualNER: A Dual-Teaching framework for Zero-shot Cross-lingual Named Entity Recognition
Jiali Zeng, Yu Jiang, Yongjing Yin, Xu Wang, Binghuai Lin, Yunbo Cao
15 Nov 2022

Training Dynamics for Curriculum Learning: A Study on Monolingual and Cross-lingual NLU
Fenia Christopoulou, Gerasimos Lampouras, Ignacio Iacobacci
22 Oct 2022

Enhancing Cross-lingual Transfer by Manifold Mixup
Huiyun Yang, Huadong Chen, Hao Zhou, Lei Li
AAML
09 May 2022

Nearest Neighbour Few-Shot Learning for Cross-lingual Classification
M Saiful Bari, Batool Haider, Saab Mansour
VLM
06 Sep 2021

Consistency Regularization for Cross-Lingual Fine-Tuning
Bo Zheng, Li Dong, Shaohan Huang, Wenhui Wang, Zewen Chi, Saksham Singhal, Wanxiang Che, Ting Liu, Xia Song, Furu Wei
15 Jun 2021

A Semi-supervised Multi-task Learning Approach to Classify Customer Contact Intents
Li Dong, Matthew C. Spencer, Amir Biagi
10 Jun 2021

VALUE: A Multi-Task Benchmark for Video-and-Language Understanding Evaluation
Linjie Li, Jie Lei, Zhe Gan, Licheng Yu, Yen-Chun Chen, ..., Tamara L. Berg, Joey Tianyi Zhou, Jingjing Liu, Lijuan Wang, Zicheng Liu
VLM
08 Jun 2021

MergeDistill: Merging Pre-trained Language Models using Distillation
Simran Khanuja, Melvin Johnson, Partha P. Talukdar
05 Jun 2021

nmT5 -- Is parallel data still relevant for pre-training massively multilingual language models?
Mihir Kale, Aditya Siddhant, Noah Constant, Melvin Johnson, Rami Al-Rfou, Linting Xue
LRM
03 Jun 2021

XeroAlign: Zero-Shot Cross-lingual Transformer Alignment
Milan Gritta, Ignacio Iacobacci
06 May 2021

XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation
Sebastian Ruder, Noah Constant, Jan A. Botha, Aditya Siddhant, Orhan Firat, ..., Pengfei Liu, Junjie Hu, Dan Garrette, Graham Neubig, Melvin Johnson
ELM, AAML, LRM
15 Apr 2021

MuRIL: Multilingual Representations for Indian Languages
Simran Khanuja, Diksha Bansal, Sarvesh Mehtani, Savya Khosla, Atreyee Dey, ..., Shachi Dave, Shruti Gupta, Subhash Chandra Bose Gali, Vishnu Subramanian, Partha P. Talukdar
19 Mar 2021

SILT: Efficient transformer training for inter-lingual inference
Javier Huertas-Tato, Alejandro Martín, David Camacho
17 Mar 2021

Distilling Large Language Models into Tiny and Effective Students using pQRNN
P. Kaliamoorthi, Aditya Siddhant, Edward Li, Melvin Johnson
MQ
21 Jan 2021

Pivot Through English: Reliably Answering Multilingual Questions without Document Retrieval
Ivan Montero, Shayne Longpre, Ni Lao, Andrew J. Frank, Christopher DuBois
LRM
28 Dec 2020

Globetrotter: Connecting Languages by Connecting Images
Dídac Surís, Dave Epstein, Carl Vondrick
VLM
08 Dec 2020

VECO: Variable and Flexible Cross-lingual Pre-training for Language Understanding and Generation
Fuli Luo, Wei Wang, Jiahao Liu, Yijia Liu, Bin Bi, Songfang Huang, Fei Huang, Luo Si
30 Oct 2020

Rethinking embedding coupling in pre-trained language models
Hyung Won Chung, Thibault Févry, Henry Tsai, Melvin Johnson, Sebastian Ruder
24 Oct 2020

DICT-MLM: Improved Multilingual Pre-Training using Bilingual Dictionaries
Aditi Chaudhary, K. Raman, Krishna Srinivasan, Jiecao Chen
23 Oct 2020

mT5: A massively multilingual pre-trained text-to-text transformer
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel
22 Oct 2020

TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages
J. Clark, Eunsol Choi, Michael Collins, Dan Garrette, Tom Kwiatkowski, Vitaly Nikolaev, J. Palomaki
10 Mar 2020