Distilling Efficient Language-Specific Models for Cross-Lingual Transfer

2 June 2023
Alan Ansell, E. Ponti, Anna Korhonen, Ivan Vulić
arXiv:2306.01709

Papers citing "Distilling Efficient Language-Specific Models for Cross-Lingual Transfer"

4 / 4 papers shown
Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation
Jan Christian Blaise Cruz, Alham Fikri Aji
22 Jan 2025
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages
Abteen Ebrahimi, Manuel Mager, Arturo Oncevay, Vishrav Chaudhary, Luis Chiruzzo, ..., Graham Neubig, Alexis Palmer, Rolando A. Coto Solano, Ngoc Thang Vu, Katharina Kann
18 Apr 2021
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
M. Vidoni, Ivan Vulić, Goran Glavas
11 Dec 2020
Rethinking embedding coupling in pre-trained language models
Hyung Won Chung, Thibault Févry, Henry Tsai, Melvin Johnson, Sebastian Ruder
24 Oct 2020