ResearchTrend.AI

Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning (arXiv:2004.14218)

29 April 2020
Zihan Liu
Genta Indra Winata
Andrea Madotto
Pascale Fung
    CLL

Papers citing "Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning"

5 / 5 papers shown
HOP to the Next Tasks and Domains for Continual Learning in NLP
Umberto Michieli, Mete Ozay
VLM
39 · 2 · 0
28 Feb 2024
Memory Efficient Continual Learning with Transformers
Beyza Ermis, Giovanni Zappella, Martin Wistuba, Aditya Rawal, Cédric Archambeau
CLL
26 · 42 · 0
09 Mar 2022
Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning
Zixuan Ke, Bing-Quan Liu, Nianzu Ma, Hu Xu, Lei Shu
CLL
192 · 122 · 0
05 Dec 2021
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar
LRM
43 · 74 · 0
01 Jul 2021
Word Translation Without Parallel Data
Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou
189 · 1,639 · 0
11 Oct 2017