arXiv: 2306.01709
Distilling Efficient Language-Specific Models for Cross-Lingual Transfer
2 June 2023
Alan Ansell, Edoardo Ponti, Anna Korhonen, Ivan Vulić
Papers citing "Distilling Efficient Language-Specific Models for Cross-Lingual Transfer" (4 papers shown)
Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation
Jan Christian Blaise Cruz, Alham Fikri Aji
22 Jan 2025
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages
Abteen Ebrahimi, Manuel Mager, Arturo Oncevay, Vishrav Chaudhary, Luis Chiruzzo, ..., Graham Neubig, Alexis Palmer, Rolando A. Coto Solano, Ngoc Thang Vu, Katharina Kann
18 Apr 2021
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
M. Vidoni, Ivan Vulić, Goran Glavas
11 Dec 2020
Rethinking embedding coupling in pre-trained language models
Hyung Won Chung, Thibault Févry, Henry Tsai, Melvin Johnson, Sebastian Ruder
24 Oct 2020