ResearchTrend.AI

Por Qué Não Utiliser Alla Språk? Mixed Training with Gradient Optimization in Few-Shot Cross-Lingual Transfer

29 April 2022 · arXiv:2204.13869
Haoran Xu
Kenton W. Murray

Papers citing "Por Qué Não Utiliser Alla Språk? Mixed Training with Gradient Optimization in Few-Shot Cross-Lingual Transfer" (5 papers)

  • GradSim: Gradient-Based Language Grouping for Effective Multilingual Training. Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze (23 Oct 2023)
  • Transfer to a Low-Resource Language via Close Relatives: The Case Study on Faroese. Vésteinn Snæbjarnarson, A. Simonsen, Goran Glavaš, Ivan Vulić (18 Apr 2023)
  • Improving Multilingual Translation by Representation and Gradient Regularization. Yilin Yang, Akiko Eriguchi, Alexandre Muzio, Prasad Tadepalli, Stefan Lee, Hany Hassan (10 Sep 2021)
  • Word Translation Without Parallel Data. Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou (11 Oct 2017)
  • Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks. Chelsea Finn, Pieter Abbeel, Sergey Levine (09 Mar 2017)